metaflow-stubs 2.18.12__py2.py3-none-any.whl → 2.19.0__py2.py3-none-any.whl

This diff shows the changes between two publicly released versions of this package, as published to a supported registry. It is provided for informational purposes only and reflects the package contents as they appear in the public registry.

Potentially problematic release.



Files changed (168)
  1. metaflow-stubs/__init__.pyi +503 -502
  2. metaflow-stubs/cards.pyi +2 -2
  3. metaflow-stubs/cli.pyi +2 -2
  4. metaflow-stubs/cli_components/__init__.pyi +2 -2
  5. metaflow-stubs/cli_components/utils.pyi +2 -2
  6. metaflow-stubs/client/__init__.pyi +3 -2
  7. metaflow-stubs/client/core.pyi +62 -31
  8. metaflow-stubs/client/filecache.pyi +20 -4
  9. metaflow-stubs/events.pyi +7 -8
  10. metaflow-stubs/exception.pyi +2 -2
  11. metaflow-stubs/flowspec.pyi +7 -8
  12. metaflow-stubs/generated_for.txt +1 -1
  13. metaflow-stubs/includefile.pyi +4 -4
  14. metaflow-stubs/meta_files.pyi +2 -2
  15. metaflow-stubs/metadata_provider/__init__.pyi +2 -2
  16. metaflow-stubs/metadata_provider/heartbeat.pyi +2 -2
  17. metaflow-stubs/metadata_provider/metadata.pyi +2 -2
  18. metaflow-stubs/metadata_provider/util.pyi +2 -2
  19. metaflow-stubs/metaflow_config.pyi +16 -2
  20. metaflow-stubs/metaflow_current.pyi +32 -34
  21. metaflow-stubs/metaflow_git.pyi +3 -5
  22. metaflow-stubs/multicore_utils.pyi +4 -4
  23. metaflow-stubs/packaging_sys/__init__.pyi +39 -40
  24. metaflow-stubs/packaging_sys/backend.pyi +13 -13
  25. metaflow-stubs/packaging_sys/distribution_support.pyi +6 -7
  26. metaflow-stubs/packaging_sys/tar_backend.pyi +11 -11
  27. metaflow-stubs/packaging_sys/utils.pyi +3 -6
  28. metaflow-stubs/packaging_sys/v1.pyi +4 -4
  29. metaflow-stubs/parameters.pyi +6 -9
  30. metaflow-stubs/plugins/__init__.pyi +11 -11
  31. metaflow-stubs/plugins/airflow/__init__.pyi +2 -2
  32. metaflow-stubs/plugins/airflow/airflow_decorator.pyi +2 -2
  33. metaflow-stubs/plugins/airflow/airflow_utils.pyi +2 -2
  34. metaflow-stubs/plugins/airflow/exception.pyi +2 -2
  35. metaflow-stubs/plugins/airflow/sensors/__init__.pyi +2 -2
  36. metaflow-stubs/plugins/airflow/sensors/base_sensor.pyi +2 -2
  37. metaflow-stubs/plugins/airflow/sensors/external_task_sensor.pyi +2 -2
  38. metaflow-stubs/plugins/airflow/sensors/s3_sensor.pyi +2 -2
  39. metaflow-stubs/plugins/argo/__init__.pyi +2 -2
  40. metaflow-stubs/plugins/argo/argo_client.pyi +2 -2
  41. metaflow-stubs/plugins/argo/argo_events.pyi +2 -2
  42. metaflow-stubs/plugins/argo/argo_workflows.pyi +3 -3
  43. metaflow-stubs/plugins/argo/argo_workflows_decorator.pyi +4 -4
  44. metaflow-stubs/plugins/argo/argo_workflows_deployer.pyi +5 -3
  45. metaflow-stubs/plugins/argo/argo_workflows_deployer_objects.pyi +11 -9
  46. metaflow-stubs/plugins/argo/exit_hooks.pyi +4 -7
  47. metaflow-stubs/plugins/aws/__init__.pyi +3 -3
  48. metaflow-stubs/plugins/aws/aws_client.pyi +2 -2
  49. metaflow-stubs/plugins/aws/aws_utils.pyi +2 -2
  50. metaflow-stubs/plugins/aws/batch/__init__.pyi +2 -2
  51. metaflow-stubs/plugins/aws/batch/batch.pyi +2 -2
  52. metaflow-stubs/plugins/aws/batch/batch_client.pyi +2 -2
  53. metaflow-stubs/plugins/aws/batch/batch_decorator.pyi +2 -2
  54. metaflow-stubs/plugins/aws/secrets_manager/__init__.pyi +2 -2
  55. metaflow-stubs/plugins/aws/secrets_manager/aws_secrets_manager_secrets_provider.pyi +3 -3
  56. metaflow-stubs/plugins/aws/step_functions/__init__.pyi +2 -2
  57. metaflow-stubs/plugins/aws/step_functions/dynamo_db_client.pyi +2 -2
  58. metaflow-stubs/plugins/aws/step_functions/event_bridge_client.pyi +2 -2
  59. metaflow-stubs/plugins/aws/step_functions/schedule_decorator.pyi +2 -2
  60. metaflow-stubs/plugins/aws/step_functions/step_functions.pyi +2 -2
  61. metaflow-stubs/plugins/aws/step_functions/step_functions_client.pyi +2 -2
  62. metaflow-stubs/plugins/aws/step_functions/step_functions_decorator.pyi +2 -2
  63. metaflow-stubs/plugins/aws/step_functions/step_functions_deployer.pyi +5 -3
  64. metaflow-stubs/plugins/aws/step_functions/step_functions_deployer_objects.pyi +9 -7
  65. metaflow-stubs/plugins/azure/__init__.pyi +2 -2
  66. metaflow-stubs/plugins/azure/azure_credential.pyi +2 -2
  67. metaflow-stubs/plugins/azure/azure_exceptions.pyi +2 -2
  68. metaflow-stubs/plugins/azure/azure_secret_manager_secrets_provider.pyi +3 -3
  69. metaflow-stubs/plugins/azure/azure_utils.pyi +2 -2
  70. metaflow-stubs/plugins/azure/blob_service_client_factory.pyi +2 -2
  71. metaflow-stubs/plugins/azure/includefile_support.pyi +2 -2
  72. metaflow-stubs/plugins/cards/__init__.pyi +2 -2
  73. metaflow-stubs/plugins/cards/card_client.pyi +5 -6
  74. metaflow-stubs/plugins/cards/card_creator.pyi +2 -2
  75. metaflow-stubs/plugins/cards/card_datastore.pyi +4 -2
  76. metaflow-stubs/plugins/cards/card_decorator.pyi +2 -2
  77. metaflow-stubs/plugins/cards/card_modules/__init__.pyi +2 -2
  78. metaflow-stubs/plugins/cards/card_modules/basic.pyi +2 -2
  79. metaflow-stubs/plugins/cards/card_modules/card.pyi +2 -2
  80. metaflow-stubs/plugins/cards/card_modules/components.pyi +14 -14
  81. metaflow-stubs/plugins/cards/card_modules/convert_to_native_type.pyi +2 -2
  82. metaflow-stubs/plugins/cards/card_modules/json_viewer.pyi +4 -4
  83. metaflow-stubs/plugins/cards/card_modules/renderer_tools.pyi +2 -2
  84. metaflow-stubs/plugins/cards/card_modules/test_cards.pyi +2 -2
  85. metaflow-stubs/plugins/cards/card_resolver.pyi +2 -2
  86. metaflow-stubs/plugins/cards/component_serializer.pyi +2 -2
  87. metaflow-stubs/plugins/cards/exception.pyi +2 -2
  88. metaflow-stubs/plugins/catch_decorator.pyi +3 -3
  89. metaflow-stubs/plugins/datatools/__init__.pyi +2 -2
  90. metaflow-stubs/plugins/datatools/local.pyi +2 -2
  91. metaflow-stubs/plugins/datatools/s3/__init__.pyi +2 -2
  92. metaflow-stubs/plugins/datatools/s3/s3.pyi +27 -35
  93. metaflow-stubs/plugins/datatools/s3/s3tail.pyi +2 -2
  94. metaflow-stubs/plugins/datatools/s3/s3util.pyi +2 -2
  95. metaflow-stubs/plugins/debug_logger.pyi +2 -2
  96. metaflow-stubs/plugins/debug_monitor.pyi +2 -2
  97. metaflow-stubs/plugins/environment_decorator.pyi +2 -2
  98. metaflow-stubs/plugins/events_decorator.pyi +2 -2
  99. metaflow-stubs/plugins/exit_hook/__init__.pyi +2 -2
  100. metaflow-stubs/plugins/exit_hook/exit_hook_decorator.pyi +2 -2
  101. metaflow-stubs/plugins/frameworks/__init__.pyi +2 -2
  102. metaflow-stubs/plugins/frameworks/pytorch.pyi +2 -2
  103. metaflow-stubs/plugins/gcp/__init__.pyi +2 -2
  104. metaflow-stubs/plugins/gcp/gcp_secret_manager_secrets_provider.pyi +3 -3
  105. metaflow-stubs/plugins/gcp/gs_exceptions.pyi +2 -2
  106. metaflow-stubs/plugins/gcp/gs_storage_client_factory.pyi +2 -2
  107. metaflow-stubs/plugins/gcp/gs_utils.pyi +2 -2
  108. metaflow-stubs/plugins/gcp/includefile_support.pyi +2 -2
  109. metaflow-stubs/plugins/kubernetes/__init__.pyi +2 -2
  110. metaflow-stubs/plugins/kubernetes/kube_utils.pyi +3 -3
  111. metaflow-stubs/plugins/kubernetes/kubernetes.pyi +2 -2
  112. metaflow-stubs/plugins/kubernetes/kubernetes_client.pyi +2 -2
  113. metaflow-stubs/plugins/kubernetes/kubernetes_decorator.pyi +2 -2
  114. metaflow-stubs/plugins/kubernetes/kubernetes_job.pyi +2 -2
  115. metaflow-stubs/plugins/kubernetes/kubernetes_jobsets.pyi +2 -2
  116. metaflow-stubs/plugins/kubernetes/spot_monitor_sidecar.pyi +2 -2
  117. metaflow-stubs/plugins/parallel_decorator.pyi +2 -2
  118. metaflow-stubs/plugins/parsers.pyi +2 -2
  119. metaflow-stubs/plugins/project_decorator.pyi +2 -2
  120. metaflow-stubs/plugins/pypi/__init__.pyi +2 -2
  121. metaflow-stubs/plugins/pypi/conda_decorator.pyi +2 -2
  122. metaflow-stubs/plugins/pypi/conda_environment.pyi +3 -3
  123. metaflow-stubs/plugins/pypi/parsers.pyi +2 -2
  124. metaflow-stubs/plugins/pypi/pypi_decorator.pyi +2 -2
  125. metaflow-stubs/plugins/pypi/pypi_environment.pyi +2 -2
  126. metaflow-stubs/plugins/pypi/utils.pyi +2 -2
  127. metaflow-stubs/plugins/resources_decorator.pyi +2 -2
  128. metaflow-stubs/plugins/retry_decorator.pyi +2 -2
  129. metaflow-stubs/plugins/secrets/__init__.pyi +2 -2
  130. metaflow-stubs/plugins/secrets/inline_secrets_provider.pyi +3 -3
  131. metaflow-stubs/plugins/secrets/secrets_decorator.pyi +2 -2
  132. metaflow-stubs/plugins/secrets/secrets_func.pyi +3 -5
  133. metaflow-stubs/plugins/secrets/secrets_spec.pyi +2 -2
  134. metaflow-stubs/plugins/secrets/utils.pyi +2 -2
  135. metaflow-stubs/plugins/storage_executor.pyi +2 -2
  136. metaflow-stubs/plugins/test_unbounded_foreach_decorator.pyi +3 -3
  137. metaflow-stubs/plugins/timeout_decorator.pyi +3 -3
  138. metaflow-stubs/plugins/uv/__init__.pyi +2 -2
  139. metaflow-stubs/plugins/uv/uv_environment.pyi +3 -3
  140. metaflow-stubs/pylint_wrapper.pyi +2 -2
  141. metaflow-stubs/runner/__init__.pyi +2 -2
  142. metaflow-stubs/runner/deployer.pyi +20 -19
  143. metaflow-stubs/runner/deployer_impl.pyi +6 -4
  144. metaflow-stubs/runner/metaflow_runner.pyi +134 -21
  145. metaflow-stubs/runner/nbdeploy.pyi +3 -3
  146. metaflow-stubs/runner/nbrun.pyi +3 -3
  147. metaflow-stubs/runner/subprocess_manager.pyi +8 -8
  148. metaflow-stubs/runner/utils.pyi +4 -5
  149. metaflow-stubs/system/__init__.pyi +2 -2
  150. metaflow-stubs/system/system_logger.pyi +4 -5
  151. metaflow-stubs/system/system_monitor.pyi +3 -3
  152. metaflow-stubs/tagging_util.pyi +2 -2
  153. metaflow-stubs/tuple_util.pyi +2 -2
  154. metaflow-stubs/user_configs/__init__.pyi +2 -2
  155. metaflow-stubs/user_configs/config_options.pyi +5 -6
  156. metaflow-stubs/user_configs/config_parameters.pyi +9 -9
  157. metaflow-stubs/user_decorators/__init__.pyi +2 -2
  158. metaflow-stubs/user_decorators/common.pyi +6 -6
  159. metaflow-stubs/user_decorators/mutable_flow.pyi +6 -7
  160. metaflow-stubs/user_decorators/mutable_step.pyi +6 -9
  161. metaflow-stubs/user_decorators/user_flow_decorator.pyi +6 -7
  162. metaflow-stubs/user_decorators/user_step_decorator.pyi +22 -15
  163. metaflow-stubs/version.pyi +2 -2
  164. {metaflow_stubs-2.18.12.dist-info → metaflow_stubs-2.19.0.dist-info}/METADATA +2 -2
  165. metaflow_stubs-2.19.0.dist-info/RECORD +168 -0
  166. metaflow_stubs-2.18.12.dist-info/RECORD +0 -168
  167. {metaflow_stubs-2.18.12.dist-info → metaflow_stubs-2.19.0.dist-info}/WHEEL +0 -0
  168. {metaflow_stubs-2.18.12.dist-info → metaflow_stubs-2.19.0.dist-info}/top_level.txt +0 -0
@@ -1,7 +1,7 @@
 ######################################################################################################
 # Auto-generated Metaflow stub file #
-# MF version: 2.18.12 #
-# Generated on 2025-10-15T20:44:18.181602 #
+# MF version: 2.19.0 #
+# Generated on 2025-10-26T02:29:29.828022 #
 ######################################################################################################

 from __future__ import annotations
@@ -39,8 +39,8 @@ from .user_decorators.user_step_decorator import UserStepDecorator as UserStepDe
 from .user_decorators.user_step_decorator import StepMutator as StepMutator
 from .user_decorators.user_step_decorator import user_step_decorator as user_step_decorator
 from .user_decorators.user_flow_decorator import FlowMutator as FlowMutator
-from . import tuple_util as tuple_util
 from . import metaflow_git as metaflow_git
+from . import tuple_util as tuple_util
 from . import events as events
 from . import runner as runner
 from . import plugins as plugins
@@ -48,9 +48,9 @@ from .plugins.datatools.s3.s3 import S3 as S3
 from . import includefile as includefile
 from .includefile import IncludeFile as IncludeFile
 from .plugins.parsers import yaml_parser as yaml_parser
-from .plugins.pypi.parsers import requirements_txt_parser as requirements_txt_parser
 from .plugins.pypi.parsers import pyproject_toml_parser as pyproject_toml_parser
 from .plugins.pypi.parsers import conda_environment_yml_parser as conda_environment_yml_parser
+from .plugins.pypi.parsers import requirements_txt_parser as requirements_txt_parser
 from . import cards as cards
 from . import client as client
 from .client.core import namespace as namespace
@@ -59,6 +59,7 @@ from .client.core import default_namespace as default_namespace
 from .client.core import metadata as metadata
 from .client.core import get_metadata as get_metadata
 from .client.core import default_metadata as default_metadata
+from .client.core import inspect_spin as inspect_spin
 from .client.core import Metaflow as Metaflow
 from .client.core import Flow as Flow
 from .client.core import Run as Run
@@ -71,8 +72,8 @@ from .runner.deployer import Deployer as Deployer
 from .runner.deployer import DeployedFlow as DeployedFlow
 from .runner.nbdeploy import NBDeployer as NBDeployer
 from . import version as version
-from . import system as system
 from . import cli_components as cli_components
+from . import system as system
 from . import pylint_wrapper as pylint_wrapper
 from . import cli as cli

@@ -119,7 +120,7 @@ def step(f: typing.Callable[[FlowSpecDerived], None]) -> typing.Callable[[FlowSp
 def step(f: typing.Callable[[FlowSpecDerived, typing.Any], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
     ...

-def step(f: typing.Union[typing.Callable[[FlowSpecDerived], None], typing.Callable[[FlowSpecDerived, typing.Any], None]]):
+def step(f: typing.Callable[[~FlowSpecDerived], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any], NoneType]):
     """
     Marks a method in a FlowSpec as a Metaflow Step. Note that this
     decorator needs to be placed as close to the method as possible (ie:
@@ -154,62 +155,26 @@ def step(f: typing.Union[typing.Callable[[FlowSpecDerived], None], typing.Callab
     ...

 @typing.overload
-def retry(*, times: int = 3, minutes_between_retries: int = 2) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+def parallel(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
     """
-    Specifies the number of times the task corresponding
-    to a step needs to be retried.
-
-    This decorator is useful for handling transient errors, such as networking issues.
-    If your task contains operations that can't be retried safely, e.g. database updates,
-    it is advisable to annotate it with `@retry(times=0)`.
-
-    This can be used in conjunction with the `@catch` decorator. The `@catch`
-    decorator will execute a no-op task after all retries have been exhausted,
-    ensuring that the flow execution can continue.
-
-
-    Parameters
-    ----------
-    times : int, default 3
-        Number of times to retry this task.
-    minutes_between_retries : int, default 2
-        Number of minutes between retries.
+    Decorator prototype for all step decorators. This function gets specialized
+    and imported for all decorators types by _import_plugin_decorators().
     """
     ...

 @typing.overload
-def retry(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
-    ...
-
-@typing.overload
-def retry(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+def parallel(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
     ...

-def retry(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, times: int = 3, minutes_between_retries: int = 2):
+def parallel(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None):
     """
-    Specifies the number of times the task corresponding
-    to a step needs to be retried.
-
-    This decorator is useful for handling transient errors, such as networking issues.
-    If your task contains operations that can't be retried safely, e.g. database updates,
-    it is advisable to annotate it with `@retry(times=0)`.
-
-    This can be used in conjunction with the `@catch` decorator. The `@catch`
-    decorator will execute a no-op task after all retries have been exhausted,
-    ensuring that the flow execution can continue.
-
-
-    Parameters
-    ----------
-    times : int, default 3
-        Number of times to retry this task.
-    minutes_between_retries : int, default 2
-        Number of minutes between retries.
+    Decorator prototype for all step decorators. This function gets specialized
+    and imported for all decorators types by _import_plugin_decorators().
     """
     ...

 @typing.overload
-def catch(*, var: typing.Optional[str] = None, print_exception: bool = True) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+def catch(*, var: str | None = None, print_exception: bool = True) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
     """
     Specifies that the step will success under all circumstances.

@@ -238,7 +203,7 @@ def catch(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Calla
 def catch(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
     ...

-def catch(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, var: typing.Optional[str] = None, print_exception: bool = True):
+def catch(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None, *, var: str | None = None, print_exception: bool = True):
     """
     Specifies that the step will success under all circumstances.

@@ -260,321 +225,195 @@ def catch(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], ty
     ...

 @typing.overload
-def card(*, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+def conda(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: str | None = None, disabled: bool = False) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
     """
-    Creates a human-readable report, a Metaflow Card, after this step completes.
+    Specifies the Conda environment for the step.

-    Note that you may add multiple `@card` decorators in a step with different parameters.
+    Information in this decorator will augment any
+    attributes set in the `@conda_base` flow-level decorator. Hence,
+    you can use `@conda_base` to set packages required by all
+    steps and use `@conda` to specify step-specific overrides.


     Parameters
     ----------
-    type : str, default 'default'
-        Card type.
-    id : str, optional, default None
-        If multiple cards are present, use this id to identify this card.
-    options : Dict[str, Any], default {}
-        Options passed to the card. The contents depend on the card type.
-    timeout : int, default 45
-        Interrupt reporting if it takes more than this many seconds.
+    packages : Dict[str, str], default {}
+        Packages to use for this step. The key is the name of the package
+        and the value is the version to use.
+    libraries : Dict[str, str], default {}
+        Supported for backward compatibility. When used with packages, packages will take precedence.
+    python : str, optional, default None
+        Version of Python to use, e.g. '3.7.4'. A default value of None implies
+        that the version used will correspond to the version of the Python interpreter used to start the run.
+    disabled : bool, default False
+        If set to True, disables @conda.
     """
     ...

 @typing.overload
-def card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+def conda(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
     ...

 @typing.overload
-def card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+def conda(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
     ...

-def card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45):
+def conda(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: str | None = None, disabled: bool = False):
     """
-    Creates a human-readable report, a Metaflow Card, after this step completes.
+    Specifies the Conda environment for the step.

-    Note that you may add multiple `@card` decorators in a step with different parameters.
+    Information in this decorator will augment any
+    attributes set in the `@conda_base` flow-level decorator. Hence,
+    you can use `@conda_base` to set packages required by all
+    steps and use `@conda` to specify step-specific overrides.


     Parameters
     ----------
-    type : str, default 'default'
-        Card type.
-    id : str, optional, default None
-        If multiple cards are present, use this id to identify this card.
-    options : Dict[str, Any], default {}
-        Options passed to the card. The contents depend on the card type.
-    timeout : int, default 45
-        Interrupt reporting if it takes more than this many seconds.
+    packages : Dict[str, str], default {}
+        Packages to use for this step. The key is the name of the package
+        and the value is the version to use.
+    libraries : Dict[str, str], default {}
+        Supported for backward compatibility. When used with packages, packages will take precedence.
+    python : str, optional, default None
+        Version of Python to use, e.g. '3.7.4'. A default value of None implies
+        that the version used will correspond to the version of the Python interpreter used to start the run.
+    disabled : bool, default False
+        If set to True, disables @conda.
     """
     ...

 @typing.overload
-def environment(*, vars: typing.Dict[str, str] = {}) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+def pypi(*, packages: typing.Dict[str, str] = {}, python: str | None = None) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
     """
-    Specifies environment variables to be set prior to the execution of a step.
+    Specifies the PyPI packages for the step.
+
+    Information in this decorator will augment any
+    attributes set in the `@pyi_base` flow-level decorator. Hence,
+    you can use `@pypi_base` to set packages required by all
+    steps and use `@pypi` to specify step-specific overrides.


     Parameters
     ----------
-    vars : Dict[str, str], default {}
-        Dictionary of environment variables to set.
+    packages : Dict[str, str], default: {}
+        Packages to use for this step. The key is the name of the package
+        and the value is the version to use.
+    python : str, optional, default: None
+        Version of Python to use, e.g. '3.7.4'. A default value of None implies
+        that the version used will correspond to the version of the Python interpreter used to start the run.
     """
     ...

 @typing.overload
-def environment(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+def pypi(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
     ...

 @typing.overload
-def environment(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+def pypi(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
     ...

-def environment(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, vars: typing.Dict[str, str] = {}):
+def pypi(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None, *, packages: typing.Dict[str, str] = {}, python: str | None = None):
     """
-    Specifies environment variables to be set prior to the execution of a step.
+    Specifies the PyPI packages for the step.
+
+    Information in this decorator will augment any
+    attributes set in the `@pyi_base` flow-level decorator. Hence,
+    you can use `@pypi_base` to set packages required by all
+    steps and use `@pypi` to specify step-specific overrides.


     Parameters
     ----------
-    vars : Dict[str, str], default {}
-        Dictionary of environment variables to set.
+    packages : Dict[str, str], default: {}
+        Packages to use for this step. The key is the name of the package
+        and the value is the version to use.
+    python : str, optional, default: None
+        Version of Python to use, e.g. '3.7.4'. A default value of None implies
+        that the version used will correspond to the version of the Python interpreter used to start the run.
     """
     ...

 @typing.overload
-def batch(*, cpu: int = 1, gpu: int = 0, memory: int = 4096, image: typing.Optional[str] = None, queue: str = 'METAFLOW_BATCH_JOB_QUEUE', iam_role: str = 'METAFLOW_ECS_S3_ACCESS_IAM_ROLE', execution_role: str = 'METAFLOW_ECS_FARGATE_EXECUTION_ROLE', shared_memory: typing.Optional[int] = None, max_swap: typing.Optional[int] = None, swappiness: typing.Optional[int] = None, aws_batch_tags: typing.Optional[typing.Dict[str, str]] = None, use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: typing.Optional[int] = None, tmpfs_path: typing.Optional[str] = None, inferentia: int = 0, trainium: int = None, efa: int = 0, ephemeral_storage: int = None, log_driver: typing.Optional[str] = None, log_options: typing.Optional[typing.List[str]] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+def resources(*, cpu: int = 1, gpu: int | None = None, disk: int | None = None, memory: int = 4096, shared_memory: int | None = None) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
     """
-    Specifies that this step should execute on [AWS Batch](https://aws.amazon.com/batch/).
+    Specifies the resources needed when executing this step.
+
+    Use `@resources` to specify the resource requirements
+    independently of the specific compute layer (`@batch`, `@kubernetes`).
+
+    You can choose the compute layer on the command line by executing e.g.
+    ```
+    python myflow.py run --with batch
+    ```
+    or
+    ```
+    python myflow.py run --with kubernetes
+    ```
+    which executes the flow on the desired system using the
+    requirements specified in `@resources`.


     Parameters
     ----------
     cpu : int, default 1
-        Number of CPUs required for this step. If `@resources` is
-        also present, the maximum value from all decorators is used.
-    gpu : int, default 0
-        Number of GPUs required for this step. If `@resources` is
-        also present, the maximum value from all decorators is used.
+        Number of CPUs required for this step.
+    gpu : int, optional, default None
+        Number of GPUs required for this step.
+    disk : int, optional, default None
+        Disk size (in MB) required for this step. Only applies on Kubernetes.
     memory : int, default 4096
-        Memory size (in MB) required for this step. If
-        `@resources` is also present, the maximum value from all decorators is
-        used.
-    image : str, optional, default None
-        Docker image to use when launching on AWS Batch. If not specified, and
-        METAFLOW_BATCH_CONTAINER_IMAGE is specified, that image is used. If
-        not, a default Docker image mapping to the current version of Python is used.
-    queue : str, default METAFLOW_BATCH_JOB_QUEUE
-        AWS Batch Job Queue to submit the job to.
-    iam_role : str, default METAFLOW_ECS_S3_ACCESS_IAM_ROLE
-        AWS IAM role that AWS Batch container uses to access AWS cloud resources.
-    execution_role : str, default METAFLOW_ECS_FARGATE_EXECUTION_ROLE
-        AWS IAM role that AWS Batch can use [to trigger AWS Fargate tasks]
-        (https://docs.aws.amazon.com/batch/latest/userguide/execution-IAM-role.html).
+        Memory size (in MB) required for this step.
     shared_memory : int, optional, default None
         The value for the size (in MiB) of the /dev/shm volume for this step.
         This parameter maps to the `--shm-size` option in Docker.
-    max_swap : int, optional, default None
-        The total amount of swap memory (in MiB) a container can use for this
-        step. This parameter is translated to the `--memory-swap` option in
-        Docker where the value is the sum of the container memory plus the
-        `max_swap` value.
-    swappiness : int, optional, default None
-        This allows you to tune memory swappiness behavior for this step.
-        A swappiness value of 0 causes swapping not to happen unless absolutely
-        necessary. A swappiness value of 100 causes pages to be swapped very
-        aggressively. Accepted values are whole numbers between 0 and 100.
-    aws_batch_tags: Dict[str, str], optional, default None
-        Sets arbitrary AWS tags on the AWS Batch compute environment.
-        Set as string key-value pairs.
-    use_tmpfs : bool, default False
-        This enables an explicit tmpfs mount for this step. Note that tmpfs is
391
- not available on Fargate compute environments
392
- tmpfs_tempdir : bool, default True
393
- sets METAFLOW_TEMPDIR to tmpfs_path if set for this step.
394
- tmpfs_size : int, optional, default None
395
- The value for the size (in MiB) of the tmpfs mount for this step.
396
- This parameter maps to the `--tmpfs` option in Docker. Defaults to 50% of the
397
- memory allocated for this step.
398
- tmpfs_path : str, optional, default None
399
- Path to tmpfs mount for this step. Defaults to /metaflow_temp.
400
- inferentia : int, default 0
401
- Number of Inferentia chips required for this step.
402
- trainium : int, default None
403
- Alias for inferentia. Use only one of the two.
404
- efa : int, default 0
405
- Number of elastic fabric adapter network devices to attach to container
406
- ephemeral_storage : int, default None
407
- The total amount, in GiB, of ephemeral storage to set for the task, 21-200GiB.
408
- This is only relevant for Fargate compute environments
409
- log_driver: str, optional, default None
410
- The log driver to use for the Amazon ECS container.
411
- log_options: List[str], optional, default None
412
- List of strings containing options for the chosen log driver. The configurable values
413
- depend on the `log driver` chosen. Validation of these options is not supported yet.
414
- Example: [`awslogs-group:aws/batch/job`]
415
370
  """
416
371
  ...
417
372
 
418
373
  @typing.overload
419
- def batch(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
374
+ def resources(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
420
375
  ...
421
376
 
422
377
  @typing.overload
423
- def batch(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
378
+ def resources(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
424
379
  ...
425
380
 
426
- def batch(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, cpu: int = 1, gpu: int = 0, memory: int = 4096, image: typing.Optional[str] = None, queue: str = 'METAFLOW_BATCH_JOB_QUEUE', iam_role: str = 'METAFLOW_ECS_S3_ACCESS_IAM_ROLE', execution_role: str = 'METAFLOW_ECS_FARGATE_EXECUTION_ROLE', shared_memory: typing.Optional[int] = None, max_swap: typing.Optional[int] = None, swappiness: typing.Optional[int] = None, aws_batch_tags: typing.Optional[typing.Dict[str, str]] = None, use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: typing.Optional[int] = None, tmpfs_path: typing.Optional[str] = None, inferentia: int = 0, trainium: int = None, efa: int = 0, ephemeral_storage: int = None, log_driver: typing.Optional[str] = None, log_options: typing.Optional[typing.List[str]] = None):
381
+ def resources(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None, *, cpu: int = 1, gpu: int | None = None, disk: int | None = None, memory: int = 4096, shared_memory: int | None = None):
427
382
  """
428
- Specifies that this step should execute on [AWS Batch](https://aws.amazon.com/batch/).
383
+ Specifies the resources needed when executing this step.
384
+
385
+ Use `@resources` to specify the resource requirements
386
+ independently of the specific compute layer (`@batch`, `@kubernetes`).
387
+
388
+ You can choose the compute layer on the command line by executing e.g.
389
+ ```
390
+ python myflow.py run --with batch
391
+ ```
392
+ or
393
+ ```
394
+ python myflow.py run --with kubernetes
395
+ ```
396
+ which executes the flow on the desired system using the
397
+ requirements specified in `@resources`.
429
398
 
430
399
 
431
400
  Parameters
432
401
  ----------
433
402
  cpu : int, default 1
434
- Number of CPUs required for this step. If `@resources` is
435
- also present, the maximum value from all decorators is used.
436
- gpu : int, default 0
437
- Number of GPUs required for this step. If `@resources` is
438
- also present, the maximum value from all decorators is used.
403
+ Number of CPUs required for this step.
404
+ gpu : int, optional, default None
405
+ Number of GPUs required for this step.
406
+ disk : int, optional, default None
407
+ Disk size (in MB) required for this step. Only applies on Kubernetes.
439
408
  memory : int, default 4096
440
- Memory size (in MB) required for this step. If
441
- `@resources` is also present, the maximum value from all decorators is
442
- used.
443
- image : str, optional, default None
444
- Docker image to use when launching on AWS Batch. If not specified, and
445
- METAFLOW_BATCH_CONTAINER_IMAGE is specified, that image is used. If
446
- not, a default Docker image mapping to the current version of Python is used.
447
- queue : str, default METAFLOW_BATCH_JOB_QUEUE
448
- AWS Batch Job Queue to submit the job to.
449
- iam_role : str, default METAFLOW_ECS_S3_ACCESS_IAM_ROLE
450
- AWS IAM role that AWS Batch container uses to access AWS cloud resources.
451
- execution_role : str, default METAFLOW_ECS_FARGATE_EXECUTION_ROLE
452
- AWS IAM role that AWS Batch can use [to trigger AWS Fargate tasks]
453
- (https://docs.aws.amazon.com/batch/latest/userguide/execution-IAM-role.html).
409
+ Memory size (in MB) required for this step.
454
410
  shared_memory : int, optional, default None
455
411
  The value for the size (in MiB) of the /dev/shm volume for this step.
456
412
  This parameter maps to the `--shm-size` option in Docker.
457
- max_swap : int, optional, default None
458
- The total amount of swap memory (in MiB) a container can use for this
459
- step. This parameter is translated to the `--memory-swap` option in
460
- Docker where the value is the sum of the container memory plus the
461
- `max_swap` value.
462
- swappiness : int, optional, default None
463
- This allows you to tune memory swappiness behavior for this step.
464
- A swappiness value of 0 causes swapping not to happen unless absolutely
465
- necessary. A swappiness value of 100 causes pages to be swapped very
466
- aggressively. Accepted values are whole numbers between 0 and 100.
467
- aws_batch_tags: Dict[str, str], optional, default None
468
- Sets arbitrary AWS tags on the AWS Batch compute environment.
469
- Set as string key-value pairs.
470
- use_tmpfs : bool, default False
471
- This enables an explicit tmpfs mount for this step. Note that tmpfs is
472
- not available on Fargate compute environments
473
- tmpfs_tempdir : bool, default True
474
- sets METAFLOW_TEMPDIR to tmpfs_path if set for this step.
475
- tmpfs_size : int, optional, default None
476
- The value for the size (in MiB) of the tmpfs mount for this step.
477
- This parameter maps to the `--tmpfs` option in Docker. Defaults to 50% of the
478
- memory allocated for this step.
479
- tmpfs_path : str, optional, default None
480
- Path to tmpfs mount for this step. Defaults to /metaflow_temp.
481
- inferentia : int, default 0
482
- Number of Inferentia chips required for this step.
483
- trainium : int, default None
484
- Alias for inferentia. Use only one of the two.
485
- efa : int, default 0
486
- Number of elastic fabric adapter network devices to attach to container
487
- ephemeral_storage : int, default None
488
- The total amount, in GiB, of ephemeral storage to set for the task, 21-200GiB.
489
- This is only relevant for Fargate compute environments
490
- log_driver: str, optional, default None
491
- The log driver to use for the Amazon ECS container.
492
- log_options: List[str], optional, default None
493
- List of strings containing options for the chosen log driver. The configurable values
494
- depend on the `log driver` chosen. Validation of these options is not supported yet.
495
- Example: [`awslogs-group:aws/batch/job`]
496
413
  """
497
414
  ...
498
415
 
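The paired `@typing.overload` stubs above exist because `@resources` can be applied either bare or with keyword arguments. A minimal, self-contained sketch of that call pattern (the function name and stored attribute are illustrative, not Metaflow internals):

```python
import typing

def resources_like(f=None, *, cpu: int = 1, gpu: typing.Optional[int] = None,
                   memory: int = 4096):
    """Toy re-implementation of the dual call pattern in the stubs:
    usable both as `@resources_like` and `@resources_like(cpu=2)`."""
    def attach(func):
        # Record the requested resources on the decorated function.
        func.resource_spec = {"cpu": cpu, "gpu": gpu, "memory": memory}
        return func
    if f is None:
        # Called with arguments: @resources_like(cpu=2) returns a decorator.
        return attach
    # Called bare: @resources_like decorates immediately with the defaults.
    return attach(f)

@resources_like
def step_a():
    pass

@resources_like(cpu=2, memory=8192)
def step_b():
    pass
```

This is why the stub declares one overload taking only keyword arguments (returning a decorator) and two taking the step function directly.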
- @typing.overload
- def parallel(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- """
- Decorator prototype for all step decorators. This function gets specialized
- and imported for all decorators types by _import_plugin_decorators().
- """
- ...
-
- @typing.overload
- def parallel(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def parallel(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
- """
- Decorator prototype for all step decorators. This function gets specialized
- and imported for all decorators types by _import_plugin_decorators().
- """
- ...
-
- @typing.overload
- def conda(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- Specifies the Conda environment for the step.
-
- Information in this decorator will augment any
- attributes set in the `@conda_base` flow-level decorator. Hence,
- you can use `@conda_base` to set packages required by all
- steps and use `@conda` to specify step-specific overrides.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables @conda.
- """
- ...
-
- @typing.overload
- def conda(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def conda(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def conda(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
- """
- Specifies the Conda environment for the step.
-
- Information in this decorator will augment any
- attributes set in the `@conda_base` flow-level decorator. Hence,
- you can use `@conda_base` to set packages required by all
- steps and use `@conda` to specify step-specific overrides.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables @conda.
- """
- ...
-
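The `@conda` docstring above states that the legacy `libraries` argument is kept for backward compatibility and that `packages` takes precedence on conflict. A minimal sketch of that precedence rule (the helper name is hypothetical, not Metaflow's actual resolver):

```python
def merge_conda_deps(packages: dict, libraries: dict) -> dict:
    # `libraries` is the legacy argument; entries in `packages` win on
    # conflict, mirroring the precedence described in the @conda docstring.
    merged = dict(libraries)
    merged.update(packages)
    return merged

deps = merge_conda_deps(
    packages={"numpy": "1.26.4"},
    libraries={"numpy": "1.21.0", "pandas": "1.5.3"},
)
```

Here the `numpy` pin from `packages` overrides the one from `libraries`, while `pandas` survives untouched.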
- def kubernetes(*, cpu: int = 1, memory: int = 4096, disk: int = 10240, image: typing.Optional[str] = None, image_pull_policy: str = 'KUBERNETES_IMAGE_PULL_POLICY', image_pull_secrets: typing.List[str] = [], service_account: str = 'METAFLOW_KUBERNETES_SERVICE_ACCOUNT', secrets: typing.Optional[typing.List[str]] = None, node_selector: typing.Union[typing.Dict[str, str], str, None] = None, namespace: str = 'METAFLOW_KUBERNETES_NAMESPACE', gpu: typing.Optional[int] = None, gpu_vendor: str = 'KUBERNETES_GPU_VENDOR', tolerations: typing.List[typing.Dict[str, str]] = [], labels: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_LABELS', annotations: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_ANNOTATIONS', use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: typing.Optional[int] = None, tmpfs_path: typing.Optional[str] = '/metaflow_temp', persistent_volume_claims: typing.Optional[typing.Dict[str, str]] = None, shared_memory: typing.Optional[int] = None, port: typing.Optional[int] = None, compute_pool: typing.Optional[str] = None, hostname_resolution_timeout: int = 600, qos: str = 'Burstable', security_context: typing.Optional[typing.Dict[str, typing.Any]] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def kubernetes(*, cpu: int = 1, memory: int = 4096, disk: int = 10240, image: str | None = None, image_pull_policy: str = 'KUBERNETES_IMAGE_PULL_POLICY', image_pull_secrets: typing.List[str] = [], service_account: str = 'METAFLOW_KUBERNETES_SERVICE_ACCOUNT', secrets: typing.List[str] | None = None, node_selector: typing.Dict[str, str] | str | None = None, namespace: str = 'METAFLOW_KUBERNETES_NAMESPACE', gpu: int | None = None, gpu_vendor: str = 'KUBERNETES_GPU_VENDOR', tolerations: typing.List[typing.Dict[str, str]] = [], labels: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_LABELS', annotations: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_ANNOTATIONS', use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: int | None = None, tmpfs_path: str | None = '/metaflow_temp', persistent_volume_claims: typing.Dict[str, str] | None = None, shared_memory: int | None = None, port: int | None = None, compute_pool: str | None = None, hostname_resolution_timeout: int = 600, qos: str = 'Burstable', security_context: typing.Dict[str, typing.Any] | None = None) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
  """
  Specifies that this step should execute on Kubernetes.

@@ -664,86 +503,40 @@ def kubernetes(*, cpu: int = 1, memory: int = 4096, disk: int = 10240, image: ty
  ...

  @typing.overload
- def resources(*, cpu: int = 1, gpu: typing.Optional[int] = None, disk: typing.Optional[int] = None, memory: int = 4096, shared_memory: typing.Optional[int] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def environment(*, vars: typing.Dict[str, str] = {}) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
  """
- Specifies the resources needed when executing this step.
-
- Use `@resources` to specify the resource requirements
- independently of the specific compute layer (`@batch`, `@kubernetes`).
-
- You can choose the compute layer on the command line by executing e.g.
- ```
- python myflow.py run --with batch
- ```
- or
- ```
- python myflow.py run --with kubernetes
- ```
- which executes the flow on the desired system using the
- requirements specified in `@resources`.
+ Specifies environment variables to be set prior to the execution of a step.


  Parameters
  ----------
- cpu : int, default 1
- Number of CPUs required for this step.
- gpu : int, optional, default None
- Number of GPUs required for this step.
- disk : int, optional, default None
- Disk size (in MB) required for this step. Only applies on Kubernetes.
- memory : int, default 4096
- Memory size (in MB) required for this step.
- shared_memory : int, optional, default None
- The value for the size (in MiB) of the /dev/shm volume for this step.
- This parameter maps to the `--shm-size` option in Docker.
+ vars : Dict[str, str], default {}
+ Dictionary of environment variables to set.
  """
  ...

  @typing.overload
- def resources(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ def environment(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  ...

  @typing.overload
- def resources(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def environment(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def resources(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, cpu: int = 1, gpu: typing.Optional[int] = None, disk: typing.Optional[int] = None, memory: int = 4096, shared_memory: typing.Optional[int] = None):
+ def environment(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None, *, vars: typing.Dict[str, str] = {}):
  """
- Specifies the resources needed when executing this step.
-
- Use `@resources` to specify the resource requirements
- independently of the specific compute layer (`@batch`, `@kubernetes`).
-
- You can choose the compute layer on the command line by executing e.g.
- ```
- python myflow.py run --with batch
- ```
- or
- ```
- python myflow.py run --with kubernetes
- ```
- which executes the flow on the desired system using the
- requirements specified in `@resources`.
+ Specifies environment variables to be set prior to the execution of a step.


  Parameters
  ----------
- cpu : int, default 1
- Number of CPUs required for this step.
- gpu : int, optional, default None
- Number of GPUs required for this step.
- disk : int, optional, default None
- Disk size (in MB) required for this step. Only applies on Kubernetes.
- memory : int, default 4096
- Memory size (in MB) required for this step.
- shared_memory : int, optional, default None
- The value for the size (in MiB) of the /dev/shm volume for this step.
- This parameter maps to the `--shm-size` option in Docker.
+ vars : Dict[str, str], default {}
+ Dictionary of environment variables to set.
  """
  ...

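The `@environment(vars=...)` stub above says the given variables are set prior to the execution of a step. A minimal sketch of that behavior using the standard library (the helper name is illustrative, not Metaflow's implementation):

```python
import os

def apply_environment(vars: dict) -> None:
    # Sketch of what @environment(vars=...) arranges: each key/value pair
    # is placed into the process environment before the step body runs.
    os.environ.update(vars)

apply_environment({"MY_FEATURE_FLAG": "1"})
```

After the call, the step body (and any subprocess it launches) can read the variable via `os.environ["MY_FEATURE_FLAG"]`.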
  @typing.overload
- def timeout(*, seconds: int = 0, minutes: int = 0, hours: int = 0) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def timeout(*, seconds: int = 0, minutes: int = 0, hours: int = 0) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
  """
  Specifies a timeout for your step.

@@ -776,7 +569,7 @@ def timeout(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Cal
  def timeout(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def timeout(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, seconds: int = 0, minutes: int = 0, hours: int = 0):
+ def timeout(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None, *, seconds: int = 0, minutes: int = 0, hours: int = 0):
  """
  Specifies a timeout for your step.

@@ -802,58 +595,266 @@ def timeout(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None],
  ...

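The `@timeout` signatures above take `seconds`, `minutes`, and `hours` separately. Assuming, as in Metaflow's documentation, that the three values are combined into a single duration, the arithmetic is a one-liner (the helper name is illustrative):

```python
def total_timeout_seconds(seconds: int = 0, minutes: int = 0, hours: int = 0) -> int:
    # Combine the three @timeout arguments into one duration in seconds.
    return seconds + minutes * 60 + hours * 3600
```

For example, `@timeout(minutes=2, seconds=30)` would correspond to a 150-second limit under this interpretation.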
  @typing.overload
- def pypi(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def batch(*, cpu: int = 1, gpu: int = 0, memory: int = 4096, image: str | None = None, queue: str = 'METAFLOW_BATCH_JOB_QUEUE', iam_role: str = 'METAFLOW_ECS_S3_ACCESS_IAM_ROLE', execution_role: str = 'METAFLOW_ECS_FARGATE_EXECUTION_ROLE', shared_memory: int | None = None, max_swap: int | None = None, swappiness: int | None = None, aws_batch_tags: typing.Dict[str, str] | None = None, use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: int | None = None, tmpfs_path: str | None = None, inferentia: int = 0, trainium: int = None, efa: int = 0, ephemeral_storage: int = None, log_driver: str | None = None, log_options: typing.List[str] | None = None) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
  """
- Specifies the PyPI packages for the step.
+ Specifies that this step should execute on [AWS Batch](https://aws.amazon.com/batch/).

- Information in this decorator will augment any
- attributes set in the `@pyi_base` flow-level decorator. Hence,
- you can use `@pypi_base` to set packages required by all
- steps and use `@pypi` to specify step-specific overrides.
+
+ Parameters
+ ----------
+ cpu : int, default 1
+ Number of CPUs required for this step. If `@resources` is
+ also present, the maximum value from all decorators is used.
+ gpu : int, default 0
+ Number of GPUs required for this step. If `@resources` is
+ also present, the maximum value from all decorators is used.
+ memory : int, default 4096
+ Memory size (in MB) required for this step. If
+ `@resources` is also present, the maximum value from all decorators is
+ used.
+ image : str, optional, default None
+ Docker image to use when launching on AWS Batch. If not specified, and
+ METAFLOW_BATCH_CONTAINER_IMAGE is specified, that image is used. If
+ not, a default Docker image mapping to the current version of Python is used.
+ queue : str, default METAFLOW_BATCH_JOB_QUEUE
+ AWS Batch Job Queue to submit the job to.
+ iam_role : str, default METAFLOW_ECS_S3_ACCESS_IAM_ROLE
+ AWS IAM role that AWS Batch container uses to access AWS cloud resources.
+ execution_role : str, default METAFLOW_ECS_FARGATE_EXECUTION_ROLE
+ AWS IAM role that AWS Batch can use [to trigger AWS Fargate tasks]
+ (https://docs.aws.amazon.com/batch/latest/userguide/execution-IAM-role.html).
+ shared_memory : int, optional, default None
+ The value for the size (in MiB) of the /dev/shm volume for this step.
+ This parameter maps to the `--shm-size` option in Docker.
+ max_swap : int, optional, default None
+ The total amount of swap memory (in MiB) a container can use for this
+ step. This parameter is translated to the `--memory-swap` option in
+ Docker where the value is the sum of the container memory plus the
+ `max_swap` value.
+ swappiness : int, optional, default None
+ This allows you to tune memory swappiness behavior for this step.
+ A swappiness value of 0 causes swapping not to happen unless absolutely
+ necessary. A swappiness value of 100 causes pages to be swapped very
+ aggressively. Accepted values are whole numbers between 0 and 100.
+ aws_batch_tags: Dict[str, str], optional, default None
+ Sets arbitrary AWS tags on the AWS Batch compute environment.
+ Set as string key-value pairs.
+ use_tmpfs : bool, default False
+ This enables an explicit tmpfs mount for this step. Note that tmpfs is
+ not available on Fargate compute environments
+ tmpfs_tempdir : bool, default True
+ sets METAFLOW_TEMPDIR to tmpfs_path if set for this step.
+ tmpfs_size : int, optional, default None
+ The value for the size (in MiB) of the tmpfs mount for this step.
+ This parameter maps to the `--tmpfs` option in Docker. Defaults to 50% of the
+ memory allocated for this step.
+ tmpfs_path : str, optional, default None
+ Path to tmpfs mount for this step. Defaults to /metaflow_temp.
+ inferentia : int, default 0
+ Number of Inferentia chips required for this step.
+ trainium : int, default None
+ Alias for inferentia. Use only one of the two.
+ efa : int, default 0
+ Number of elastic fabric adapter network devices to attach to container
+ ephemeral_storage : int, default None
+ The total amount, in GiB, of ephemeral storage to set for the task, 21-200GiB.
+ This is only relevant for Fargate compute environments
+ log_driver: str, optional, default None
+ The log driver to use for the Amazon ECS container.
+ log_options: List[str], optional, default None
+ List of strings containing options for the chosen log driver. The configurable values
+ depend on the `log driver` chosen. Validation of these options is not supported yet.
+ Example: [`awslogs-group:aws/batch/job`]
+ """
+ ...
+
+ @typing.overload
+ def batch(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def batch(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def batch(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None, *, cpu: int = 1, gpu: int = 0, memory: int = 4096, image: str | None = None, queue: str = 'METAFLOW_BATCH_JOB_QUEUE', iam_role: str = 'METAFLOW_ECS_S3_ACCESS_IAM_ROLE', execution_role: str = 'METAFLOW_ECS_FARGATE_EXECUTION_ROLE', shared_memory: int | None = None, max_swap: int | None = None, swappiness: int | None = None, aws_batch_tags: typing.Dict[str, str] | None = None, use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: int | None = None, tmpfs_path: str | None = None, inferentia: int = 0, trainium: int = None, efa: int = 0, ephemeral_storage: int = None, log_driver: str | None = None, log_options: typing.List[str] | None = None):
+ """
+ Specifies that this step should execute on [AWS Batch](https://aws.amazon.com/batch/).


  Parameters
  ----------
- packages : Dict[str, str], default: {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- python : str, optional, default: None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
+ cpu : int, default 1
+ Number of CPUs required for this step. If `@resources` is
+ also present, the maximum value from all decorators is used.
+ gpu : int, default 0
+ Number of GPUs required for this step. If `@resources` is
+ also present, the maximum value from all decorators is used.
+ memory : int, default 4096
+ Memory size (in MB) required for this step. If
+ `@resources` is also present, the maximum value from all decorators is
+ used.
+ image : str, optional, default None
+ Docker image to use when launching on AWS Batch. If not specified, and
+ METAFLOW_BATCH_CONTAINER_IMAGE is specified, that image is used. If
+ not, a default Docker image mapping to the current version of Python is used.
+ queue : str, default METAFLOW_BATCH_JOB_QUEUE
+ AWS Batch Job Queue to submit the job to.
+ iam_role : str, default METAFLOW_ECS_S3_ACCESS_IAM_ROLE
+ AWS IAM role that AWS Batch container uses to access AWS cloud resources.
+ execution_role : str, default METAFLOW_ECS_FARGATE_EXECUTION_ROLE
+ AWS IAM role that AWS Batch can use [to trigger AWS Fargate tasks]
+ (https://docs.aws.amazon.com/batch/latest/userguide/execution-IAM-role.html).
+ shared_memory : int, optional, default None
+ The value for the size (in MiB) of the /dev/shm volume for this step.
+ This parameter maps to the `--shm-size` option in Docker.
+ max_swap : int, optional, default None
+ The total amount of swap memory (in MiB) a container can use for this
+ step. This parameter is translated to the `--memory-swap` option in
+ Docker where the value is the sum of the container memory plus the
+ `max_swap` value.
+ swappiness : int, optional, default None
+ This allows you to tune memory swappiness behavior for this step.
+ A swappiness value of 0 causes swapping not to happen unless absolutely
+ necessary. A swappiness value of 100 causes pages to be swapped very
+ aggressively. Accepted values are whole numbers between 0 and 100.
+ aws_batch_tags: Dict[str, str], optional, default None
+ Sets arbitrary AWS tags on the AWS Batch compute environment.
+ Set as string key-value pairs.
+ use_tmpfs : bool, default False
724
+ This enables an explicit tmpfs mount for this step. Note that tmpfs is
725
+ not available on Fargate compute environments
726
+ tmpfs_tempdir : bool, default True
727
+ sets METAFLOW_TEMPDIR to tmpfs_path if set for this step.
728
+ tmpfs_size : int, optional, default None
729
+ The value for the size (in MiB) of the tmpfs mount for this step.
730
+ This parameter maps to the `--tmpfs` option in Docker. Defaults to 50% of the
731
+ memory allocated for this step.
732
+ tmpfs_path : str, optional, default None
733
+ Path to tmpfs mount for this step. Defaults to /metaflow_temp.
734
+ inferentia : int, default 0
735
+ Number of Inferentia chips required for this step.
736
+ trainium : int, default None
737
+ Alias for inferentia. Use only one of the two.
738
+ efa : int, default 0
739
+ Number of elastic fabric adapter network devices to attach to container
740
+ ephemeral_storage : int, default None
741
+ The total amount, in GiB, of ephemeral storage to set for the task, 21-200GiB.
742
+ This is only relevant for Fargate compute environments
743
+ log_driver: str, optional, default None
744
+ The log driver to use for the Amazon ECS container.
745
+ log_options: List[str], optional, default None
746
+ List of strings containing options for the chosen log driver. The configurable values
747
+ depend on the `log driver` chosen. Validation of these options is not supported yet.
748
+ Example: [`awslogs-group:aws/batch/job`]
823
749
  """
824
750
  ...
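The `cpu`, `gpu`, and `memory` entries above all state the same combination rule: when `@resources` is also present, the maximum value from all decorators is used. A minimal pure-Python sketch of that rule, for illustration only (the helper `effective_resources` is hypothetical and not part of Metaflow's API):

```python
# Hypothetical illustration of the documented rule: when both @batch and
# @resources request a value, the larger of the two wins.
def effective_resources(batch_attrs: dict, resources_attrs: dict) -> dict:
    """Combine per-decorator resource requests by taking the maximum."""
    combined = {}
    for key in ("cpu", "gpu", "memory"):
        combined[key] = max(batch_attrs.get(key, 0), resources_attrs.get(key, 0))
    return combined

# e.g. @batch(cpu=1, memory=4096) together with @resources(cpu=4, memory=2048)
print(effective_resources({"cpu": 1, "gpu": 0, "memory": 4096},
                          {"cpu": 4, "memory": 2048}))
```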

  @typing.overload
- def pypi(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ def retry(*, times: int = 3, minutes_between_retries: int = 2) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
+ """
+ Specifies the number of times the task corresponding
+ to a step needs to be retried.
+
+ This decorator is useful for handling transient errors, such as networking issues.
+ If your task contains operations that can't be retried safely, e.g. database updates,
+ it is advisable to annotate it with `@retry(times=0)`.
+
+ This can be used in conjunction with the `@catch` decorator. The `@catch`
+ decorator will execute a no-op task after all retries have been exhausted,
+ ensuring that the flow execution can continue.
+
+
+ Parameters
+ ----------
+ times : int, default 3
+ Number of times to retry this task.
+ minutes_between_retries : int, default 2
+ Number of minutes between retries.
+ """
  ...

  @typing.overload
- def pypi(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def retry(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def retry(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def pypi(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
+ def retry(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None, *, times: int = 3, minutes_between_retries: int = 2):
  """
- Specifies the PyPI packages for the step.
+ Specifies the number of times the task corresponding
+ to a step needs to be retried.
+
+ This decorator is useful for handling transient errors, such as networking issues.
+ If your task contains operations that can't be retried safely, e.g. database updates,
+ it is advisable to annotate it with `@retry(times=0)`.
+
+ This can be used in conjunction with the `@catch` decorator. The `@catch`
+ decorator will execute a no-op task after all retries have been exhausted,
+ ensuring that the flow execution can continue.
+
+
+ Parameters
+ ----------
+ times : int, default 3
+ Number of times to retry this task.
+ minutes_between_retries : int, default 2
+ Number of minutes between retries.
+ """
+ ...
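The retry semantics described in the docstring above (extra attempts after a failure, with a pause between them) can be sketched in plain Python. This is an illustration of the documented behavior only, not Metaflow's actual implementation; the `_sleep` parameter is an addition so the sketch can run without real waiting:

```python
import functools
import time

def retry(times: int = 3, minutes_between_retries: int = 2, _sleep=time.sleep):
    """Illustrative retry decorator: re-run a failing function up to
    `times` extra attempts, sleeping between attempts."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(times + 1):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == times:
                        raise  # retries exhausted, mirror @catch hand-off
                    _sleep(minutes_between_retries * 60)
        return wrapper
    return decorator

calls = []

@retry(times=3, _sleep=lambda seconds: None)  # no real sleeping in this demo
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(flaky())  # succeeds on the third attempt
```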
+
+ @typing.overload
+ def card(*, type: str = 'default', id: str | None = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
+ """
+ Creates a human-readable report, a Metaflow Card, after this step completes.
+
+ Note that you may add multiple `@card` decorators in a step with different parameters.
+
+
+ Parameters
+ ----------
+ type : str, default 'default'
+ Card type.
+ id : str, optional, default None
+ If multiple cards are present, use this id to identify this card.
+ options : Dict[str, Any], default {}
+ Options passed to the card. The contents depend on the card type.
+ timeout : int, default 45
+ Interrupt reporting if it takes more than this many seconds.
+ """
+ ...
+
+ @typing.overload
+ def card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def card(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None, *, type: str = 'default', id: str | None = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45):
+ """
+ Creates a human-readable report, a Metaflow Card, after this step completes.

- Information in this decorator will augment any
- attributes set in the `@pyi_base` flow-level decorator. Hence,
- you can use `@pypi_base` to set packages required by all
- steps and use `@pypi` to specify step-specific overrides.
+ Note that you may add multiple `@card` decorators in a step with different parameters.


  Parameters
  ----------
- packages : Dict[str, str], default: {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- python : str, optional, default: None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
+ type : str, default 'default'
+ Card type.
+ id : str, optional, default None
+ If multiple cards are present, use this id to identify this card.
+ options : Dict[str, Any], default {}
+ Options passed to the card. The contents depend on the card type.
+ timeout : int, default 45
+ Interrupt reporting if it takes more than this many seconds.
  """
  ...

  @typing.overload
- def secrets(*, sources: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], role: typing.Optional[str] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def secrets(*, sources: typing.List[str | typing.Dict[str, typing.Any]] = [], role: str | None = None) -> typing.Callable[[typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]], typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType]]:
  """
  Specifies secrets to be retrieved and injected as environment variables prior to
  the execution of a step.
@@ -876,7 +877,7 @@ def secrets(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Cal
  def secrets(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def secrets(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, sources: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], role: typing.Optional[str] = None):
+ def secrets(f: typing.Callable[[~FlowSpecDerived, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | typing.Callable[[~FlowSpecDerived, typing.Any, metaflow.cmd.develop.stub_generator.StepFlag], NoneType] | None = None, *, sources: typing.List[str | typing.Dict[str, typing.Any]] = [], role: str | None = None):
  """
  Specifies secrets to be retrieved and injected as environment variables prior to
  the execution of a step.
@@ -935,48 +936,101 @@ def airflow_external_task_sensor(*, timeout: int, poke_interval: int, mode: str,
  ...

  @typing.overload
- def pypi_base(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def schedule(*, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: str | None = None, timezone: str | None = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
  """
- Specifies the PyPI packages for all steps of the flow.
+ Specifies the times when the flow should be run when running on a
+ production scheduler.

- Use `@pypi_base` to set common packages required by all
- steps and use `@pypi` to specify step-specific overrides.

  Parameters
  ----------
- packages : Dict[str, str], default: {}
- Packages to use for this flow. The key is the name of the package
- and the value is the version to use.
- python : str, optional, default: None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
+ hourly : bool, default False
+ Run the workflow hourly.
+ daily : bool, default True
+ Run the workflow daily.
+ weekly : bool, default False
+ Run the workflow weekly.
+ cron : str, optional, default None
+ Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
+ specified by this expression.
+ timezone : str, optional, default None
+ Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
+ which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
  """
  ...

  @typing.overload
- def pypi_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ def schedule(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
  ...

- def pypi_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
+ def schedule(f: typing.Type[~FlowSpecDerived] | None = None, *, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: str | None = None, timezone: str | None = None):
  """
- Specifies the PyPI packages for all steps of the flow.
+ Specifies the times when the flow should be run when running on a
+ production scheduler.

- Use `@pypi_base` to set common packages required by all
- steps and use `@pypi` to specify step-specific overrides.

  Parameters
  ----------
- packages : Dict[str, str], default: {}
- Packages to use for this flow. The key is the name of the package
- and the value is the version to use.
- python : str, optional, default: None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
+ hourly : bool, default False
+ Run the workflow hourly.
+ daily : bool, default True
+ Run the workflow daily.
+ weekly : bool, default False
+ Run the workflow weekly.
+ cron : str, optional, default None
+ Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
+ specified by this expression.
+ timezone : str, optional, default None
+ Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
+ which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
+ """
+ ...
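The `@schedule` attributes above reduce to a single cron expression on the scheduler. A hedged sketch of that resolution: the five-field expressions and the precedence among the presets below are conventional-cron assumptions for illustration, not necessarily the exact strings or ordering Metaflow emits for a given scheduler:

```python
# Illustrative resolution of @schedule attributes to one cron expression.
# Assumption: an explicit `cron` overrides the boolean presets, and the
# presets map to standard five-field cron equivalents.
def resolve_schedule(hourly=False, daily=True, weekly=False, cron=None):
    if cron is not None:
        return cron            # an explicit expression wins
    if weekly:
        return "0 0 * * 0"     # midnight every Sunday
    if hourly:
        return "0 * * * *"     # top of every hour
    if daily:
        return "0 0 * * *"     # midnight every day
    return None

print(resolve_schedule())                     # default: daily
print(resolve_schedule(cron="*/15 * * * *"))  # custom expression wins
```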
+
+ def airflow_s3_key_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, bucket_key: str | typing.List[str], bucket_name: str, wildcard_match: bool, aws_conn_id: str, verify: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ The `@airflow_s3_key_sensor` decorator attaches an Airflow [S3KeySensor](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/_api/airflow/providers/amazon/aws/sensors/s3/index.html#airflow.providers.amazon.aws.sensors.s3.S3KeySensor)
+ before the start step of the flow. This decorator only works when a flow is scheduled on Airflow
+ and is compiled using `airflow create`. More than one `@airflow_s3_key_sensor` can be
+ added as flow decorators. Adding more than one decorator will ensure that the `start` step
+ starts only after all sensors finish.
+
+
+ Parameters
+ ----------
+ timeout : int
+ Time, in seconds, before the task times out and fails. (Default: 3600)
+ poke_interval : int
+ Time in seconds that the job should wait in between each try. (Default: 60)
+ mode : str
+ How the sensor operates. Options are: { poke | reschedule }. (Default: "poke")
+ exponential_backoff : bool
+ Allow progressively longer waits between pokes by using an exponential backoff algorithm. (Default: True)
+ pool : str
+ The slot pool this task should run in;
+ slot pools are a way to limit concurrency for certain tasks. (Default: None)
+ soft_fail : bool
+ Set to true to mark the task as SKIPPED on failure. (Default: False)
+ name : str
+ Name of the sensor on Airflow.
+ description : str
+ Description of the sensor in the Airflow UI.
+ bucket_key : Union[str, List[str]]
+ The key(s) being waited on. Supports a full s3:// style URL or a relative path from root level.
+ When it is specified as a full s3:// URL, please leave `bucket_name` as None.
+ bucket_name : str
+ Name of the S3 bucket. Only needed when bucket_key is not provided as a full s3:// URL.
+ When specified, all the keys passed to bucket_key refer to this bucket. (Default: None)
+ wildcard_match : bool
+ Whether the bucket_key should be interpreted as a Unix wildcard pattern. (Default: False)
+ aws_conn_id : str
+ A reference to the S3 connection on Airflow. (Default: None)
+ verify : bool
+ Whether or not to verify SSL certificates for the S3 connection. (Default: None)
  """
  ...

  @typing.overload
- def trigger_on_finish(*, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def trigger_on_finish(*, flow: typing.Dict[str, str] | str | None = None, flows: typing.List[str | typing.Dict[str, str]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
  """
  Specifies the flow(s) that this flow depends on.

@@ -1028,7 +1082,7 @@ def trigger_on_finish(*, flow: typing.Union[typing.Dict[str, str], str, None] =
  def trigger_on_finish(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
  ...

- def trigger_on_finish(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}):
+ def trigger_on_finish(f: typing.Type[~FlowSpecDerived] | None = None, *, flow: typing.Dict[str, str] | str | None = None, flows: typing.List[str | typing.Dict[str, str]] = [], options: typing.Dict[str, typing.Any] = {}):
  """
  Specifies the flow(s) that this flow depends on.

@@ -1076,59 +1130,43 @@ def trigger_on_finish(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *
  """
  ...

- @typing.overload
- def schedule(*, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def project(*, name: str, branch: str | None = None, production: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
  """
- Specifies the times when the flow should be run when running on a
- production scheduler.
+ Specifies what flows belong to the same project.
+
+ A project-specific namespace is created for all flows that
+ use the same `@project(name)`.


  Parameters
  ----------
- hourly : bool, default False
- Run the workflow hourly.
- daily : bool, default True
- Run the workflow daily.
- weekly : bool, default False
- Run the workflow weekly.
- cron : str, optional, default None
- Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
- specified by this expression.
- timezone : str, optional, default None
- Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
- which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
- """
- ...
-
- @typing.overload
- def schedule(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
- ...
-
- def schedule(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None):
- """
- Specifies the times when the flow should be run when running on a
- production scheduler.
+ name : str
+ Project name. Make sure that the name is unique amongst all
+ projects that use the same production scheduler. The name may
+ contain only lowercase alphanumeric characters and underscores.

+ branch : Optional[str], default None
+ The branch to use. If not specified, the branch is set to
+ `user.<username>` unless `production` is set to `True`. This can
+ also be set on the command line using `--branch` as a top-level option.
+ It is an error to specify `branch` in the decorator and on the command line.

- Parameters
- ----------
- hourly : bool, default False
- Run the workflow hourly.
- daily : bool, default True
- Run the workflow daily.
- weekly : bool, default False
- Run the workflow weekly.
- cron : str, optional, default None
- Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
- specified by this expression.
- timezone : str, optional, default None
- Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
- which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
+ production : bool, default False
+ Whether or not the branch is the production branch. This can also be set on the
+ command line using `--production` as a top-level option. It is an error to specify
+ `production` in the decorator and on the command line.
+ The project branch name will be:
+ - if `branch` is specified:
+ - if `production` is True: `prod.<branch>`
+ - if `production` is False: `test.<branch>`
+ - if `branch` is not specified:
+ - if `production` is True: `prod`
+ - if `production` is False: `user.<username>`
  """
  ...
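The branch-naming rules enumerated in the `production` entry above can be stated as a tiny function. This is a direct restatement of the documented cases for illustration; `branch_name` is a hypothetical helper, not part of Metaflow's API:

```python
# Sketch of the @project branch-naming rules quoted in the docstring above.
def branch_name(branch=None, production=False, username="alice"):
    if branch is not None:
        # explicit branch: prefixed with prod. or test.
        return ("prod." if production else "test.") + branch
    # no branch: plain prod, or the per-user namespace
    return "prod" if production else "user." + username

print(branch_name())                               # user.alice
print(branch_name(production=True))                # prod
print(branch_name(branch="rc1"))                   # test.rc1
print(branch_name(branch="rc1", production=True))  # prod.rc1
```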

  @typing.overload
- def conda_base(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def conda_base(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: str | None = None, disabled: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
  """
  Specifies the Conda environment for all steps of the flow.

@@ -1155,7 +1193,7 @@ def conda_base(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[s
  def conda_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
  ...

- def conda_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
+ def conda_base(f: typing.Type[~FlowSpecDerived] | None = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: str | None = None, disabled: bool = False):
  """
  Specifies the Conda environment for all steps of the flow.

@@ -1179,7 +1217,7 @@ def conda_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packa
  ...

  @typing.overload
- def trigger(*, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def trigger(*, event: str | typing.Dict[str, typing.Any] | None = None, events: typing.List[str | typing.Dict[str, typing.Any]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
  """
  Specifies the event(s) that this flow depends on.

@@ -1227,7 +1265,7 @@ def trigger(*, event: typing.Union[str, typing.Dict[str, typing.Any], None] = No
  def trigger(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
  ...

- def trigger(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}):
+ def trigger(f: typing.Type[~FlowSpecDerived] | None = None, *, event: str | typing.Dict[str, typing.Any] | None = None, events: typing.List[str | typing.Dict[str, typing.Any]] = [], options: typing.Dict[str, typing.Any] = {}):
  """
  Specifies the event(s) that this flow depends on.

@@ -1271,81 +1309,44 @@ def trigger(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, event: t
  """
  ...

- def airflow_s3_key_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, bucket_key: typing.Union[str, typing.List[str]], bucket_name: str, wildcard_match: bool, aws_conn_id: str, verify: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ @typing.overload
+ def pypi_base(*, packages: typing.Dict[str, str] = {}, python: str | None = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
  """
- The `@airflow_s3_key_sensor` decorator attaches a Airflow [S3KeySensor](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/_api/airflow/providers/amazon/aws/sensors/s3/index.html#airflow.providers.amazon.aws.sensors.s3.S3KeySensor)
- before the start step of the flow. This decorator only works when a flow is scheduled on Airflow
- and is compiled using `airflow create`. More than one `@airflow_s3_key_sensor` can be
- added as a flow decorators. Adding more than one decorator will ensure that `start` step
- starts only after all sensors finish.
+ Specifies the PyPI packages for all steps of the flow.

+ Use `@pypi_base` to set common packages required by all
+ steps and use `@pypi` to specify step-specific overrides.

  Parameters
  ----------
- timeout : int
- Time, in seconds before the task times out and fails. (Default: 3600)
- poke_interval : int
- Time in seconds that the job should wait in between each try. (Default: 60)
- mode : str
- How the sensor operates. Options are: { poke | reschedule }. (Default: "poke")
- exponential_backoff : bool
- allow progressive longer waits between pokes by using exponential backoff algorithm. (Default: True)
- pool : str
- the slot pool this task should run in,
- slot pools are a way to limit concurrency for certain tasks. (Default:None)
- soft_fail : bool
- Set to true to mark the task as SKIPPED on failure. (Default: False)
- name : str
- Name of the sensor on Airflow
- description : str
- Description of sensor in the Airflow UI
- bucket_key : Union[str, List[str]]
- The key(s) being waited on. Supports full s3:// style url or relative path from root level.
- When it's specified as a full s3:// url, please leave `bucket_name` as None
- bucket_name : str
- Name of the S3 bucket. Only needed when bucket_key is not provided as a full s3:// url.
- When specified, all the keys passed to bucket_key refers to this bucket. (Default:None)
- wildcard_match : bool
- whether the bucket_key should be interpreted as a Unix wildcard pattern. (Default: False)
- aws_conn_id : str
- a reference to the s3 connection on Airflow. (Default: None)
- verify : bool
- Whether or not to verify SSL certificates for S3 connection. (Default: None)
+ packages : Dict[str, str], default: {}
+ Packages to use for this flow. The key is the name of the package
+ and the value is the version to use.
+ python : str, optional, default: None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
  """
  ...

- def project(*, name: str, branch: typing.Optional[str] = None, production: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ @typing.overload
+ def pypi_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ ...
+
+ def pypi_base(f: typing.Type[~FlowSpecDerived] | None = None, *, packages: typing.Dict[str, str] = {}, python: str | None = None):
  """
- Specifies what flows belong to the same project.
-
- A project-specific namespace is created for all flows that
- use the same `@project(name)`.
+ Specifies the PyPI packages for all steps of the flow.

+ Use `@pypi_base` to set common packages required by all
+ steps and use `@pypi` to specify step-specific overrides.

  Parameters
  ----------
- name : str
- Project name. Make sure that the name is unique amongst all
- projects that use the same production scheduler. The name may
- contain only lowercase alphanumeric characters and underscores.
-
- branch : Optional[str], default None
- The branch to use. If not specified, the branch is set to
- `user.<username>` unless `production` is set to `True`. This can
- also be set on the command line using `--branch` as a top-level option.
- It is an error to specify `branch` in the decorator and on the command line.
-
- production : bool, default False
- Whether or not the branch is the production branch. This can also be set on the
- command line using `--production` as a top-level option. It is an error to specify
- `production` in the decorator and on the command line.
- The project branch name will be:
- - if `branch` is specified:
- - if `production` is True: `prod.<branch>`
- - if `production` is False: `test.<branch>`
- - if `branch` is not specified:
- - if `production` is True: `prod`
- - if `production` is False: `user.<username>`
+ packages : Dict[str, str], default: {}
+ Packages to use for this flow. The key is the name of the package
+ and the value is the version to use.
+ python : str, optional, default: None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
  """
  ...
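The `@pypi_base` docstring above describes a layering rule: flow-level packages are shared by every step, and step-level `@pypi` entries augment or override them. A minimal sketch of that merge, for illustration only (the helper `step_packages` is hypothetical, not part of Metaflow's API):

```python
# Hypothetical illustration: step-level @pypi packages override the
# flow-level @pypi_base packages, key by key.
def step_packages(pypi_base_packages: dict, pypi_packages: dict) -> dict:
    merged = dict(pypi_base_packages)  # common packages for every step
    merged.update(pypi_packages)       # step-specific additions/overrides
    return merged

# Flow pins pandas and requests; one step pins an older pandas.
print(step_packages({"pandas": "2.2.2", "requests": "2.32.3"},
                    {"pandas": "2.1.4"}))
```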