ob-metaflow-stubs 6.0.10.1__py2.py3-none-any.whl → 6.0.10.2__py2.py3-none-any.whl

This diff compares the contents of two publicly released versions of the package, as published to one of the supported registries. It is provided for informational purposes only and reflects the packages exactly as they appear in their public registries.

Potentially problematic release: this version of ob-metaflow-stubs has been flagged as possibly problematic.

Files changed (262)
  1. metaflow-stubs/__init__.pyi +984 -984
  2. metaflow-stubs/cards.pyi +2 -2
  3. metaflow-stubs/cli.pyi +2 -2
  4. metaflow-stubs/cli_components/__init__.pyi +2 -2
  5. metaflow-stubs/cli_components/utils.pyi +2 -2
  6. metaflow-stubs/client/__init__.pyi +2 -2
  7. metaflow-stubs/client/core.pyi +4 -4
  8. metaflow-stubs/client/filecache.pyi +2 -2
  9. metaflow-stubs/events.pyi +2 -2
  10. metaflow-stubs/exception.pyi +2 -2
  11. metaflow-stubs/flowspec.pyi +3 -3
  12. metaflow-stubs/generated_for.txt +1 -1
  13. metaflow-stubs/includefile.pyi +4 -4
  14. metaflow-stubs/meta_files.pyi +2 -2
  15. metaflow-stubs/metadata_provider/__init__.pyi +2 -2
  16. metaflow-stubs/metadata_provider/heartbeat.pyi +2 -2
  17. metaflow-stubs/metadata_provider/metadata.pyi +2 -2
  18. metaflow-stubs/metadata_provider/util.pyi +2 -2
  19. metaflow-stubs/metaflow_config.pyi +2 -2
  20. metaflow-stubs/metaflow_current.pyi +41 -41
  21. metaflow-stubs/metaflow_git.pyi +2 -2
  22. metaflow-stubs/mf_extensions/__init__.pyi +2 -2
  23. metaflow-stubs/mf_extensions/obcheckpoint/__init__.pyi +2 -2
  24. metaflow-stubs/mf_extensions/obcheckpoint/plugins/__init__.pyi +2 -2
  25. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/__init__.pyi +2 -2
  26. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/card_utils/__init__.pyi +2 -2
  27. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/card_utils/async_cards.pyi +2 -2
  28. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/card_utils/deco_injection_mixin.pyi +2 -2
  29. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/card_utils/extra_components.pyi +2 -2
  30. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/__init__.pyi +2 -2
  31. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/cards/__init__.pyi +2 -2
  32. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/cards/checkpoint_lister.pyi +4 -4
  33. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/cards/lineage_card.pyi +2 -2
  34. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/checkpoint_storage.pyi +4 -4
  35. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/constructors.pyi +2 -2
  36. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/core.pyi +3 -3
  37. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/decorator.pyi +4 -4
  38. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/exceptions.pyi +2 -2
  39. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/final_api.pyi +2 -2
  40. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/lineage.pyi +2 -2
  41. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/__init__.pyi +2 -2
  42. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/context.pyi +2 -2
  43. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/core.pyi +3 -3
  44. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/decorator.pyi +2 -2
  45. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/exceptions.pyi +2 -2
  46. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/task_utils.pyi +2 -2
  47. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/utils.pyi +2 -2
  48. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastructures.pyi +3 -3
  49. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/exceptions.pyi +2 -2
  50. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/hf_hub/__init__.pyi +2 -2
  51. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/hf_hub/decorator.pyi +3 -3
  52. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/modeling_utils/__init__.pyi +2 -2
  53. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/modeling_utils/core.pyi +3 -3
  54. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/modeling_utils/exceptions.pyi +2 -2
  55. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/modeling_utils/model_storage.pyi +3 -3
  56. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/__init__.pyi +2 -2
  57. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/flowspec_utils.pyi +2 -2
  58. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/general.pyi +2 -2
  59. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/identity_utils.pyi +2 -2
  60. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/serialization_handler/__init__.pyi +2 -2
  61. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/serialization_handler/base.pyi +2 -2
  62. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/serialization_handler/tar.pyi +2 -2
  63. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/tar_utils.pyi +2 -2
  64. metaflow-stubs/mf_extensions/outerbounds/__init__.pyi +2 -2
  65. metaflow-stubs/mf_extensions/outerbounds/plugins/__init__.pyi +2 -2
  66. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/__init__.pyi +2 -2
  67. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/__init__.pyi +2 -2
  68. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/_state_machine.pyi +2 -2
  69. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/_vendor/__init__.pyi +2 -2
  70. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/_vendor/spinner/__init__.pyi +2 -2
  71. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/_vendor/spinner/spinners.pyi +2 -2
  72. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/app_cli.pyi +3 -3
  73. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/app_config.pyi +2 -2
  74. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/capsule.pyi +3 -3
  75. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/click_importer.pyi +2 -2
  76. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/code_package/__init__.pyi +2 -2
  77. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/code_package/code_packager.pyi +2 -2
  78. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/__init__.pyi +2 -2
  79. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/cli_generator.pyi +2 -2
  80. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/config_utils.pyi +4 -4
  81. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/schema_export.pyi +2 -2
  82. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/typed_configs.pyi +3 -3
  83. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/unified_config.pyi +4 -4
  84. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/dependencies.pyi +3 -3
  85. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/deployer.pyi +4 -4
  86. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/experimental/__init__.pyi +2 -2
  87. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/perimeters.pyi +2 -2
  88. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/utils.pyi +3 -3
  89. metaflow-stubs/mf_extensions/outerbounds/plugins/aws/__init__.pyi +2 -2
  90. metaflow-stubs/mf_extensions/outerbounds/plugins/aws/assume_role_decorator.pyi +3 -3
  91. metaflow-stubs/mf_extensions/outerbounds/plugins/card_utilities/__init__.pyi +2 -2
  92. metaflow-stubs/mf_extensions/outerbounds/plugins/card_utilities/async_cards.pyi +2 -2
  93. metaflow-stubs/mf_extensions/outerbounds/plugins/card_utilities/injector.pyi +2 -2
  94. metaflow-stubs/mf_extensions/outerbounds/plugins/checkpoint_datastores/__init__.pyi +2 -2
  95. metaflow-stubs/mf_extensions/outerbounds/plugins/checkpoint_datastores/coreweave.pyi +3 -3
  96. metaflow-stubs/mf_extensions/outerbounds/plugins/checkpoint_datastores/nebius.pyi +3 -3
  97. metaflow-stubs/mf_extensions/outerbounds/plugins/fast_bakery/__init__.pyi +2 -2
  98. metaflow-stubs/mf_extensions/outerbounds/plugins/fast_bakery/baker.pyi +3 -3
  99. metaflow-stubs/mf_extensions/outerbounds/plugins/fast_bakery/docker_environment.pyi +3 -3
  100. metaflow-stubs/mf_extensions/outerbounds/plugins/fast_bakery/fast_bakery.pyi +2 -2
  101. metaflow-stubs/mf_extensions/outerbounds/plugins/kubernetes/__init__.pyi +2 -2
  102. metaflow-stubs/mf_extensions/outerbounds/plugins/kubernetes/pod_killer.pyi +2 -2
  103. metaflow-stubs/mf_extensions/outerbounds/plugins/ollama/__init__.pyi +2 -2
  104. metaflow-stubs/mf_extensions/outerbounds/plugins/ollama/constants.pyi +2 -2
  105. metaflow-stubs/mf_extensions/outerbounds/plugins/ollama/exceptions.pyi +2 -2
  106. metaflow-stubs/mf_extensions/outerbounds/plugins/ollama/ollama.pyi +2 -2
  107. metaflow-stubs/mf_extensions/outerbounds/plugins/ollama/status_card.pyi +2 -2
  108. metaflow-stubs/mf_extensions/outerbounds/plugins/snowflake/__init__.pyi +2 -2
  109. metaflow-stubs/mf_extensions/outerbounds/plugins/snowflake/snowflake.pyi +2 -2
  110. metaflow-stubs/mf_extensions/outerbounds/profilers/__init__.pyi +2 -2
  111. metaflow-stubs/mf_extensions/outerbounds/profilers/gpu.pyi +2 -2
  112. metaflow-stubs/mf_extensions/outerbounds/remote_config.pyi +2 -2
  113. metaflow-stubs/mf_extensions/outerbounds/toplevel/__init__.pyi +2 -2
  114. metaflow-stubs/mf_extensions/outerbounds/toplevel/global_aliases_for_metaflow_package.pyi +2 -2
  115. metaflow-stubs/mf_extensions/outerbounds/toplevel/s3_proxy.pyi +2 -2
  116. metaflow-stubs/multicore_utils.pyi +2 -2
  117. metaflow-stubs/ob_internal.pyi +2 -2
  118. metaflow-stubs/packaging_sys/__init__.pyi +5 -5
  119. metaflow-stubs/packaging_sys/backend.pyi +3 -3
  120. metaflow-stubs/packaging_sys/distribution_support.pyi +4 -4
  121. metaflow-stubs/packaging_sys/tar_backend.pyi +6 -6
  122. metaflow-stubs/packaging_sys/utils.pyi +2 -2
  123. metaflow-stubs/packaging_sys/v1.pyi +3 -3
  124. metaflow-stubs/parameters.pyi +5 -5
  125. metaflow-stubs/plugins/__init__.pyi +9 -9
  126. metaflow-stubs/plugins/airflow/__init__.pyi +2 -2
  127. metaflow-stubs/plugins/airflow/airflow_utils.pyi +2 -2
  128. metaflow-stubs/plugins/airflow/exception.pyi +2 -2
  129. metaflow-stubs/plugins/airflow/sensors/__init__.pyi +2 -2
  130. metaflow-stubs/plugins/airflow/sensors/base_sensor.pyi +2 -2
  131. metaflow-stubs/plugins/airflow/sensors/external_task_sensor.pyi +2 -2
  132. metaflow-stubs/plugins/airflow/sensors/s3_sensor.pyi +2 -2
  133. metaflow-stubs/plugins/argo/__init__.pyi +2 -2
  134. metaflow-stubs/plugins/argo/argo_client.pyi +2 -2
  135. metaflow-stubs/plugins/argo/argo_events.pyi +2 -2
  136. metaflow-stubs/plugins/argo/argo_workflows.pyi +2 -2
  137. metaflow-stubs/plugins/argo/argo_workflows_decorator.pyi +3 -3
  138. metaflow-stubs/plugins/argo/argo_workflows_deployer.pyi +3 -3
  139. metaflow-stubs/plugins/argo/argo_workflows_deployer_objects.pyi +3 -3
  140. metaflow-stubs/plugins/argo/exit_hooks.pyi +3 -3
  141. metaflow-stubs/plugins/aws/__init__.pyi +2 -2
  142. metaflow-stubs/plugins/aws/aws_client.pyi +2 -2
  143. metaflow-stubs/plugins/aws/aws_utils.pyi +2 -2
  144. metaflow-stubs/plugins/aws/batch/__init__.pyi +2 -2
  145. metaflow-stubs/plugins/aws/batch/batch.pyi +2 -2
  146. metaflow-stubs/plugins/aws/batch/batch_client.pyi +2 -2
  147. metaflow-stubs/plugins/aws/batch/batch_decorator.pyi +2 -2
  148. metaflow-stubs/plugins/aws/secrets_manager/__init__.pyi +2 -2
  149. metaflow-stubs/plugins/aws/secrets_manager/aws_secrets_manager_secrets_provider.pyi +3 -3
  150. metaflow-stubs/plugins/aws/step_functions/__init__.pyi +2 -2
  151. metaflow-stubs/plugins/aws/step_functions/event_bridge_client.pyi +2 -2
  152. metaflow-stubs/plugins/aws/step_functions/schedule_decorator.pyi +2 -2
  153. metaflow-stubs/plugins/aws/step_functions/step_functions.pyi +2 -2
  154. metaflow-stubs/plugins/aws/step_functions/step_functions_client.pyi +2 -2
  155. metaflow-stubs/plugins/aws/step_functions/step_functions_deployer.pyi +3 -3
  156. metaflow-stubs/plugins/aws/step_functions/step_functions_deployer_objects.pyi +3 -3
  157. metaflow-stubs/plugins/azure/__init__.pyi +2 -2
  158. metaflow-stubs/plugins/azure/azure_credential.pyi +2 -2
  159. metaflow-stubs/plugins/azure/azure_exceptions.pyi +2 -2
  160. metaflow-stubs/plugins/azure/azure_secret_manager_secrets_provider.pyi +3 -3
  161. metaflow-stubs/plugins/azure/azure_utils.pyi +2 -2
  162. metaflow-stubs/plugins/azure/blob_service_client_factory.pyi +2 -2
  163. metaflow-stubs/plugins/azure/includefile_support.pyi +2 -2
  164. metaflow-stubs/plugins/cards/__init__.pyi +2 -2
  165. metaflow-stubs/plugins/cards/card_client.pyi +2 -2
  166. metaflow-stubs/plugins/cards/card_creator.pyi +2 -2
  167. metaflow-stubs/plugins/cards/card_datastore.pyi +2 -2
  168. metaflow-stubs/plugins/cards/card_decorator.pyi +3 -3
  169. metaflow-stubs/plugins/cards/card_modules/__init__.pyi +2 -2
  170. metaflow-stubs/plugins/cards/card_modules/basic.pyi +2 -2
  171. metaflow-stubs/plugins/cards/card_modules/card.pyi +2 -2
  172. metaflow-stubs/plugins/cards/card_modules/components.pyi +3 -3
  173. metaflow-stubs/plugins/cards/card_modules/convert_to_native_type.pyi +2 -2
  174. metaflow-stubs/plugins/cards/card_modules/renderer_tools.pyi +2 -2
  175. metaflow-stubs/plugins/cards/card_modules/test_cards.pyi +2 -2
  176. metaflow-stubs/plugins/cards/card_resolver.pyi +2 -2
  177. metaflow-stubs/plugins/cards/component_serializer.pyi +2 -2
  178. metaflow-stubs/plugins/cards/exception.pyi +2 -2
  179. metaflow-stubs/plugins/catch_decorator.pyi +2 -2
  180. metaflow-stubs/plugins/datatools/__init__.pyi +2 -2
  181. metaflow-stubs/plugins/datatools/local.pyi +2 -2
  182. metaflow-stubs/plugins/datatools/s3/__init__.pyi +2 -2
  183. metaflow-stubs/plugins/datatools/s3/s3.pyi +3 -3
  184. metaflow-stubs/plugins/datatools/s3/s3tail.pyi +2 -2
  185. metaflow-stubs/plugins/datatools/s3/s3util.pyi +2 -2
  186. metaflow-stubs/plugins/debug_logger.pyi +2 -2
  187. metaflow-stubs/plugins/debug_monitor.pyi +2 -2
  188. metaflow-stubs/plugins/environment_decorator.pyi +2 -2
  189. metaflow-stubs/plugins/events_decorator.pyi +2 -2
  190. metaflow-stubs/plugins/exit_hook/__init__.pyi +2 -2
  191. metaflow-stubs/plugins/exit_hook/exit_hook_decorator.pyi +2 -2
  192. metaflow-stubs/plugins/frameworks/__init__.pyi +2 -2
  193. metaflow-stubs/plugins/frameworks/pytorch.pyi +2 -2
  194. metaflow-stubs/plugins/gcp/__init__.pyi +2 -2
  195. metaflow-stubs/plugins/gcp/gcp_secret_manager_secrets_provider.pyi +3 -3
  196. metaflow-stubs/plugins/gcp/gs_exceptions.pyi +2 -2
  197. metaflow-stubs/plugins/gcp/gs_storage_client_factory.pyi +2 -2
  198. metaflow-stubs/plugins/gcp/gs_utils.pyi +2 -2
  199. metaflow-stubs/plugins/gcp/includefile_support.pyi +2 -2
  200. metaflow-stubs/plugins/kubernetes/__init__.pyi +2 -2
  201. metaflow-stubs/plugins/kubernetes/kube_utils.pyi +2 -2
  202. metaflow-stubs/plugins/kubernetes/kubernetes.pyi +2 -2
  203. metaflow-stubs/plugins/kubernetes/kubernetes_client.pyi +2 -2
  204. metaflow-stubs/plugins/kubernetes/kubernetes_decorator.pyi +2 -2
  205. metaflow-stubs/plugins/kubernetes/kubernetes_jobsets.pyi +2 -2
  206. metaflow-stubs/plugins/kubernetes/spot_monitor_sidecar.pyi +2 -2
  207. metaflow-stubs/plugins/ollama/__init__.pyi +3 -3
  208. metaflow-stubs/plugins/optuna/__init__.pyi +2 -2
  209. metaflow-stubs/plugins/parallel_decorator.pyi +2 -2
  210. metaflow-stubs/plugins/perimeters.pyi +2 -2
  211. metaflow-stubs/plugins/project_decorator.pyi +2 -2
  212. metaflow-stubs/plugins/pypi/__init__.pyi +2 -2
  213. metaflow-stubs/plugins/pypi/conda_decorator.pyi +2 -2
  214. metaflow-stubs/plugins/pypi/conda_environment.pyi +4 -4
  215. metaflow-stubs/plugins/pypi/parsers.pyi +2 -2
  216. metaflow-stubs/plugins/pypi/pypi_decorator.pyi +2 -2
  217. metaflow-stubs/plugins/pypi/pypi_environment.pyi +2 -2
  218. metaflow-stubs/plugins/pypi/utils.pyi +2 -2
  219. metaflow-stubs/plugins/resources_decorator.pyi +2 -2
  220. metaflow-stubs/plugins/retry_decorator.pyi +2 -2
  221. metaflow-stubs/plugins/secrets/__init__.pyi +2 -2
  222. metaflow-stubs/plugins/secrets/inline_secrets_provider.pyi +3 -3
  223. metaflow-stubs/plugins/secrets/secrets_decorator.pyi +2 -2
  224. metaflow-stubs/plugins/secrets/secrets_func.pyi +2 -2
  225. metaflow-stubs/plugins/secrets/secrets_spec.pyi +2 -2
  226. metaflow-stubs/plugins/secrets/utils.pyi +2 -2
  227. metaflow-stubs/plugins/snowflake/__init__.pyi +2 -2
  228. metaflow-stubs/plugins/storage_executor.pyi +2 -2
  229. metaflow-stubs/plugins/test_unbounded_foreach_decorator.pyi +3 -3
  230. metaflow-stubs/plugins/timeout_decorator.pyi +2 -2
  231. metaflow-stubs/plugins/torchtune/__init__.pyi +2 -2
  232. metaflow-stubs/plugins/uv/__init__.pyi +2 -2
  233. metaflow-stubs/plugins/uv/uv_environment.pyi +3 -3
  234. metaflow-stubs/profilers/__init__.pyi +2 -2
  235. metaflow-stubs/pylint_wrapper.pyi +2 -2
  236. metaflow-stubs/runner/__init__.pyi +2 -2
  237. metaflow-stubs/runner/deployer.pyi +33 -33
  238. metaflow-stubs/runner/deployer_impl.pyi +3 -3
  239. metaflow-stubs/runner/metaflow_runner.pyi +4 -4
  240. metaflow-stubs/runner/nbdeploy.pyi +2 -2
  241. metaflow-stubs/runner/nbrun.pyi +2 -2
  242. metaflow-stubs/runner/subprocess_manager.pyi +2 -2
  243. metaflow-stubs/runner/utils.pyi +3 -3
  244. metaflow-stubs/system/__init__.pyi +2 -2
  245. metaflow-stubs/system/system_logger.pyi +3 -3
  246. metaflow-stubs/system/system_monitor.pyi +2 -2
  247. metaflow-stubs/tagging_util.pyi +2 -2
  248. metaflow-stubs/tuple_util.pyi +2 -2
  249. metaflow-stubs/user_configs/__init__.pyi +2 -2
  250. metaflow-stubs/user_configs/config_options.pyi +3 -3
  251. metaflow-stubs/user_configs/config_parameters.pyi +5 -5
  252. metaflow-stubs/user_decorators/__init__.pyi +2 -2
  253. metaflow-stubs/user_decorators/common.pyi +2 -2
  254. metaflow-stubs/user_decorators/mutable_flow.pyi +6 -6
  255. metaflow-stubs/user_decorators/mutable_step.pyi +5 -5
  256. metaflow-stubs/user_decorators/user_flow_decorator.pyi +5 -5
  257. metaflow-stubs/user_decorators/user_step_decorator.pyi +6 -6
  258. {ob_metaflow_stubs-6.0.10.1.dist-info → ob_metaflow_stubs-6.0.10.2.dist-info}/METADATA +1 -1
  259. ob_metaflow_stubs-6.0.10.2.dist-info/RECORD +262 -0
  260. ob_metaflow_stubs-6.0.10.1.dist-info/RECORD +0 -262
  261. {ob_metaflow_stubs-6.0.10.1.dist-info → ob_metaflow_stubs-6.0.10.2.dist-info}/WHEEL +0 -0
  262. {ob_metaflow_stubs-6.0.10.1.dist-info → ob_metaflow_stubs-6.0.10.2.dist-info}/top_level.txt +0 -0
@@ -1,15 +1,15 @@
  ######################################################################################################
  # Auto-generated Metaflow stub file #
- # MF version: 2.18.3.2+obcheckpoint(0.2.4);ob(v1) #
- # Generated on 2025-09-09T09:20:35.730188 #
+ # MF version: 2.18.5.1+obcheckpoint(0.2.4);ob(v1) #
+ # Generated on 2025-09-16T01:38:51.425889 #
  ######################################################################################################

  from __future__ import annotations

  import typing
  if typing.TYPE_CHECKING:
-     import typing
      import datetime
+     import typing
  FlowSpecDerived = typing.TypeVar("FlowSpecDerived", bound="FlowSpec", contravariant=False, covariant=False)
  StepFlag = typing.NewType("StepFlag", bool)

@@ -48,9 +48,9 @@ from . import plugins as plugins
  from .mf_extensions.outerbounds.toplevel.global_aliases_for_metaflow_package import S3 as S3
  from . import includefile as includefile
  from .includefile import IncludeFile as IncludeFile
- from .plugins.pypi.parsers import pyproject_toml_parser as pyproject_toml_parser
  from .plugins.pypi.parsers import conda_environment_yml_parser as conda_environment_yml_parser
  from .plugins.pypi.parsers import requirements_txt_parser as requirements_txt_parser
+ from .plugins.pypi.parsers import pyproject_toml_parser as pyproject_toml_parser
  from . import client as client
  from .client.core import namespace as namespace
  from .client.core import get_namespace as get_namespace
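The only functional change in this hunk is the position of the `pyproject_toml_parser` re-export; the import path itself is unchanged. As a hedged sketch of what these re-exported parsers do, assuming they follow Metaflow's config-parser convention of taking raw file text and returning a dict of `@pypi`/`@pypi_base` attributes:

```python
# Hedged sketch: exercising one of the re-exported PyPI dependency parsers.
# Only the import path is confirmed by this diff; the input/output shape
# is an assumption based on Metaflow's config-parser convention.
from metaflow.plugins.pypi.parsers import requirements_txt_parser

parsed = requirements_txt_parser("requests==2.31.0\nnumpy>=1.26\n")
print(parsed)  # expected shape (assumption): {"packages": {...}, ...}
```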
@@ -167,108 +167,7 @@ def step(f: typing.Union[typing.Callable[[FlowSpecDerived], None], typing.Callab
      """
      ...

- def vllm(*, model: str, backend: str, openai_api_server: bool, debug: bool, card_refresh_interval: int, max_retries: int, retry_alert_frequency: int, engine_args: dict) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
-     """
-     This decorator is used to run vllm APIs as Metaflow task sidecars.
-
-     User code call
-     --------------
-     @vllm(
-         model="...",
-         ...
-     )
-
-     Valid backend options
-     ---------------------
-     - 'local': Run as a separate process on the local task machine.
-
-     Valid model options
-     -------------------
-     Any HuggingFace model identifier, e.g. 'meta-llama/Llama-3.2-1B'
-
-     NOTE: vLLM's OpenAI-compatible server serves ONE model per server instance.
-     If you need multiple models, you must create multiple @vllm decorators.
-
-
-     Parameters
-     ----------
-     model: str
-         HuggingFace model identifier to be served by vLLM.
-     backend: str
-         Determines where and how to run the vLLM process.
-     openai_api_server: bool
-         Whether to use OpenAI-compatible API server mode (subprocess) instead of native engine.
-         Default is False (uses native engine).
-         Set to True for backward compatibility with existing code.
-     debug: bool
-         Whether to turn on verbose debugging logs.
-     card_refresh_interval: int
-         Interval in seconds for refreshing the vLLM status card.
-         Only used when openai_api_server=True.
-     max_retries: int
-         Maximum number of retries checking for vLLM server startup.
-         Only used when openai_api_server=True.
-     retry_alert_frequency: int
-         Frequency of alert logs for vLLM server startup retries.
-         Only used when openai_api_server=True.
-     engine_args : dict
-         Additional keyword arguments to pass to the vLLM engine.
-         For example, `tensor_parallel_size=2`.
-     """
-     ...
-
- @typing.overload
- def catch(*, var: typing.Optional[str] = None, print_exception: bool = True) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
-     """
-     Specifies that the step will success under all circumstances.
-
-     The decorator will create an optional artifact, specified by `var`, which
-     contains the exception raised. You can use it to detect the presence
-     of errors, indicating that all happy-path artifacts produced by the step
-     are missing.
-
-
-     Parameters
-     ----------
-     var : str, optional, default None
-         Name of the artifact in which to store the caught exception.
-         If not specified, the exception is not stored.
-     print_exception : bool, default True
-         Determines whether or not the exception is printed to
-         stdout when caught.
-     """
-     ...
-
- @typing.overload
- def catch(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
-     ...
-
- @typing.overload
- def catch(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
-     ...
-
- def catch(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, var: typing.Optional[str] = None, print_exception: bool = True):
-     """
-     Specifies that the step will success under all circumstances.
-
-     The decorator will create an optional artifact, specified by `var`, which
-     contains the exception raised. You can use it to detect the presence
-     of errors, indicating that all happy-path artifacts produced by the step
-     are missing.
-
-
-     Parameters
-     ----------
-     var : str, optional, default None
-         Name of the artifact in which to store the caught exception.
-         If not specified, the exception is not stored.
-     print_exception : bool, default True
-         Determines whether or not the exception is printed to
-         stdout when caught.
-     """
-     ...
-
- def nvidia(*, gpu: int, gpu_type: str, queue_timeout: int) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def nvct(*, gpu: int, gpu_type: str) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
      """
      Specifies that this step should execute on DGX cloud.
 
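This hunk drops the `@nvidia` stub (its `queue_timeout` documentation is removed in the next hunk) and relocates `@vllm` and `@catch` within the file, introducing the new `@nvct` DGX-cloud decorator in their place; the equal +984/−984 counts on `__init__.pyi` in the file list suggest reordering rather than outright removal. A minimal usage sketch of `@nvct`, assuming the top-level re-export implied by its presence in `__init__.pyi` and treating the GPU type as illustrative:

```python
# Minimal sketch of the new @nvct decorator; the signature (gpu, gpu_type)
# comes from the stub above, while the gpu_type value is illustrative.
from metaflow import FlowSpec, step, nvct

class DgxFlow(FlowSpec):
    @nvct(gpu=1, gpu_type="H100")  # illustrative GPU type
    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        print("done")

if __name__ == "__main__":
    DgxFlow()
```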
@@ -279,97 +178,6 @@ def nvidia(*, gpu: int, gpu_type: str, queue_timeout: int) -> typing.Callable[[t
          Number of GPUs to use.
      gpu_type : str
          Type of Nvidia GPU to use.
-     queue_timeout : int
-         Time to keep the job in NVCF's queue.
-     """
-     ...
-
- def kubernetes(*, cpu: int = 1, memory: int = 4096, disk: int = 10240, image: typing.Optional[str] = None, image_pull_policy: str = 'KUBERNETES_IMAGE_PULL_POLICY', image_pull_secrets: typing.List[str] = [], service_account: str = 'METAFLOW_KUBERNETES_SERVICE_ACCOUNT', secrets: typing.Optional[typing.List[str]] = None, node_selector: typing.Union[typing.Dict[str, str], str, None] = None, namespace: str = 'METAFLOW_KUBERNETES_NAMESPACE', gpu: typing.Optional[int] = None, gpu_vendor: str = 'KUBERNETES_GPU_VENDOR', tolerations: typing.List[typing.Dict[str, str]] = [], labels: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_LABELS', annotations: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_ANNOTATIONS', use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: typing.Optional[int] = None, tmpfs_path: typing.Optional[str] = '/metaflow_temp', persistent_volume_claims: typing.Optional[typing.Dict[str, str]] = None, shared_memory: typing.Optional[int] = None, port: typing.Optional[int] = None, compute_pool: typing.Optional[str] = None, hostname_resolution_timeout: int = 600, qos: str = 'Burstable', security_context: typing.Optional[typing.Dict[str, typing.Any]] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
-     """
-     Specifies that this step should execute on Kubernetes.
-
-
-     Parameters
-     ----------
-     cpu : int, default 1
-         Number of CPUs required for this step. If `@resources` is
-         also present, the maximum value from all decorators is used.
-     memory : int, default 4096
-         Memory size (in MB) required for this step. If
-         `@resources` is also present, the maximum value from all decorators is
-         used.
-     disk : int, default 10240
-         Disk size (in MB) required for this step. If
-         `@resources` is also present, the maximum value from all decorators is
-         used.
-     image : str, optional, default None
-         Docker image to use when launching on Kubernetes. If not specified, and
-         METAFLOW_KUBERNETES_CONTAINER_IMAGE is specified, that image is used. If
-         not, a default Docker image mapping to the current version of Python is used.
-     image_pull_policy: str, default KUBERNETES_IMAGE_PULL_POLICY
-         If given, the imagePullPolicy to be applied to the Docker image of the step.
-     image_pull_secrets: List[str], default []
-         The default is extracted from METAFLOW_KUBERNETES_IMAGE_PULL_SECRETS.
-         Kubernetes image pull secrets to use when pulling container images
-         in Kubernetes.
-     service_account : str, default METAFLOW_KUBERNETES_SERVICE_ACCOUNT
-         Kubernetes service account to use when launching pod in Kubernetes.
-     secrets : List[str], optional, default None
-         Kubernetes secrets to use when launching pod in Kubernetes. These
-         secrets are in addition to the ones defined in `METAFLOW_KUBERNETES_SECRETS`
-         in Metaflow configuration.
-     node_selector: Union[Dict[str,str], str], optional, default None
-         Kubernetes node selector(s) to apply to the pod running the task.
-         Can be passed in as a comma separated string of values e.g.
-         'kubernetes.io/os=linux,kubernetes.io/arch=amd64' or as a dictionary
-         {'kubernetes.io/os': 'linux', 'kubernetes.io/arch': 'amd64'}
-     namespace : str, default METAFLOW_KUBERNETES_NAMESPACE
-         Kubernetes namespace to use when launching pod in Kubernetes.
-     gpu : int, optional, default None
-         Number of GPUs required for this step. A value of zero implies that
-         the scheduled node should not have GPUs.
-     gpu_vendor : str, default KUBERNETES_GPU_VENDOR
-         The vendor of the GPUs to be used for this step.
-     tolerations : List[Dict[str,str]], default []
-         The default is extracted from METAFLOW_KUBERNETES_TOLERATIONS.
-         Kubernetes tolerations to use when launching pod in Kubernetes.
-     labels: Dict[str, str], default: METAFLOW_KUBERNETES_LABELS
-         Kubernetes labels to use when launching pod in Kubernetes.
-     annotations: Dict[str, str], default: METAFLOW_KUBERNETES_ANNOTATIONS
-         Kubernetes annotations to use when launching pod in Kubernetes.
-     use_tmpfs : bool, default False
-         This enables an explicit tmpfs mount for this step.
-     tmpfs_tempdir : bool, default True
-         sets METAFLOW_TEMPDIR to tmpfs_path if set for this step.
-     tmpfs_size : int, optional, default: None
-         The value for the size (in MiB) of the tmpfs mount for this step.
-         This parameter maps to the `--tmpfs` option in Docker. Defaults to 50% of the
-         memory allocated for this step.
-     tmpfs_path : str, optional, default /metaflow_temp
-         Path to tmpfs mount for this step.
-     persistent_volume_claims : Dict[str, str], optional, default None
-         A map (dictionary) of persistent volumes to be mounted to the pod for this step. The map is from persistent
-         volumes to the path to which the volume is to be mounted, e.g., `{'pvc-name': '/path/to/mount/on'}`.
-     shared_memory: int, optional
-         Shared memory size (in MiB) required for this step
-     port: int, optional
-         Port number to specify in the Kubernetes job object
-     compute_pool : str, optional, default None
-         Compute pool to be used for for this step.
-         If not specified, any accessible compute pool within the perimeter is used.
-     hostname_resolution_timeout: int, default 10 * 60
-         Timeout in seconds for the workers tasks in the gang scheduled cluster to resolve the hostname of control task.
-         Only applicable when @parallel is used.
-     qos: str, default: Burstable
-         Quality of Service class to assign to the pod. Supported values are: Guaranteed, Burstable, BestEffort
-
-     security_context: Dict[str, Any], optional, default None
-         Container security context. Applies to the task container. Allows the following keys:
-         - privileged: bool, optional, default None
-         - allow_privilege_escalation: bool, optional, default None
-         - run_as_user: int, optional, default None
-         - run_as_group: int, optional, default None
-         - run_as_non_root: bool, optional, default None
      """
      ...
 
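The `@kubernetes` stub removed above reappears elsewhere in the reordered file. A hedged usage sketch based on the docstring, with illustrative values for a few of its many parameters:

```python
# Hedged sketch of @kubernetes per the docstring above; all values are
# illustrative and the documented defaults are assumed to still apply.
from metaflow import FlowSpec, step, kubernetes

class K8sFlow(FlowSpec):
    @kubernetes(
        cpu=2,
        memory=8192,  # in MB, per the docstring
        node_selector={"kubernetes.io/arch": "amd64"},
        use_tmpfs=True,  # tmpfs mounted at /metaflow_temp by default
    )
    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    K8sFlow()
```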
@@ -503,103 +311,93 @@ def model(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], ty
      ...

  @typing.overload
- def resources(*, cpu: int = 1, gpu: typing.Optional[int] = None, disk: typing.Optional[int] = None, memory: int = 4096, shared_memory: typing.Optional[int] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def test_append_card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
      """
-     Specifies the resources needed when executing this step.
-
-     Use `@resources` to specify the resource requirements
-     independently of the specific compute layer (`@batch`, `@kubernetes`).
-
-     You can choose the compute layer on the command line by executing e.g.
-     ```
-     python myflow.py run --with batch
-     ```
-     or
-     ```
-     python myflow.py run --with kubernetes
-     ```
-     which executes the flow on the desired system using the
-     requirements specified in `@resources`.
-
-
-     Parameters
-     ----------
-     cpu : int, default 1
-         Number of CPUs required for this step.
-     gpu : int, optional, default None
-         Number of GPUs required for this step.
-     disk : int, optional, default None
-         Disk size (in MB) required for this step. Only applies on Kubernetes.
-     memory : int, default 4096
-         Memory size (in MB) required for this step.
-     shared_memory : int, optional, default None
-         The value for the size (in MiB) of the /dev/shm volume for this step.
-         This parameter maps to the `--shm-size` option in Docker.
+     A simple decorator that demonstrates using CardDecoratorInjector
+     to inject a card and render simple markdown content.
      """
      ...

  @typing.overload
- def resources(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ def test_append_card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+     ...
+
+ def test_append_card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
+     """
+     A simple decorator that demonstrates using CardDecoratorInjector
+     to inject a card and render simple markdown content.
+     """
      ...

  @typing.overload
- def resources(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def nebius_s3_proxy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+     """
+     Nebius-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
+     It exists to make it easier for users to know that this decorator should only be used with
+     a Neo Cloud like Nebius.
+     """
      ...

- def resources(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, cpu: int = 1, gpu: typing.Optional[int] = None, disk: typing.Optional[int] = None, memory: int = 4096, shared_memory: typing.Optional[int] = None):
+ @typing.overload
+ def nebius_s3_proxy(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+     ...
+
+ def nebius_s3_proxy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
      """
-     Specifies the resources needed when executing this step.
-
-     Use `@resources` to specify the resource requirements
-     independently of the specific compute layer (`@batch`, `@kubernetes`).
+     Nebius-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
+     It exists to make it easier for users to know that this decorator should only be used with
+     a Neo Cloud like Nebius.
+     """
+     ...
+
+ @typing.overload
+ def pypi(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+     """
+     Specifies the PyPI packages for the step.

-     You can choose the compute layer on the command line by executing e.g.
-     ```
-     python myflow.py run --with batch
-     ```
-     or
-     ```
-     python myflow.py run --with kubernetes
-     ```
-     which executes the flow on the desired system using the
-     requirements specified in `@resources`.
+     Information in this decorator will augment any
+     attributes set in the `@pyi_base` flow-level decorator. Hence,
+     you can use `@pypi_base` to set packages required by all
+     steps and use `@pypi` to specify step-specific overrides.


      Parameters
      ----------
-     cpu : int, default 1
-         Number of CPUs required for this step.
-     gpu : int, optional, default None
-         Number of GPUs required for this step.
-     disk : int, optional, default None
-         Disk size (in MB) required for this step. Only applies on Kubernetes.
-     memory : int, default 4096
-         Memory size (in MB) required for this step.
-     shared_memory : int, optional, default None
-         The value for the size (in MiB) of the /dev/shm volume for this step.
-         This parameter maps to the `--shm-size` option in Docker.
+     packages : Dict[str, str], default: {}
+         Packages to use for this step. The key is the name of the package
+         and the value is the version to use.
+     python : str, optional, default: None
+         Version of Python to use, e.g. '3.7.4'. A default value of None implies
+         that the version used will correspond to the version of the Python interpreter used to start the run.
      """
      ...

- def s3_proxy(*, integration_name: typing.Optional[str] = None, write_mode: typing.Optional[str] = None, debug: typing.Optional[bool] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ @typing.overload
+ def pypi(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+     ...
+
+ @typing.overload
+ def pypi(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+     ...
+
+ def pypi(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
      """
-     S3 Proxy decorator for routing S3 requests through a local proxy service.
+     Specifies the PyPI packages for the step.
+
+     Information in this decorator will augment any
+     attributes set in the `@pyi_base` flow-level decorator. Hence,
+     you can use `@pypi_base` to set packages required by all
+     steps and use `@pypi` to specify step-specific overrides.


      Parameters
      ----------
-     integration_name : str, optional
-         Name of the S3 proxy integration. If not specified, will use the only
-         available S3 proxy integration in the namespace (fails if multiple exist).
-     write_mode : str, optional
-         The desired behavior during write operations to target (origin) S3 bucket.
-         allowed options are:
-         "origin-and-cache" -> write to both the target S3 bucket and local object
-         storage
-         "origin" -> only write to the target S3 bucket
-         "cache" -> only write to the object storage service used for caching
-     debug : bool, optional
-         Enable debug logging for proxy operations.
+     packages : Dict[str, str], default: {}
+         Packages to use for this step. The key is the name of the package
+         and the value is the version to use.
+     python : str, optional, default: None
+         Version of Python to use, e.g. '3.7.4'. A default value of None implies
+         that the version used will correspond to the version of the Python interpreter used to start the run.
      """
      ...
 
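`@pypi` is re-added here unchanged (the `@pyi_base` spelling in its docstring is a typo for `@pypi_base` carried over from upstream). A hedged sketch of the documented base-plus-override pattern, with illustrative package pins:

```python
# Hedged sketch of @pypi_base/@pypi per the docstring above;
# package names and versions are illustrative.
from metaflow import FlowSpec, step, pypi, pypi_base

@pypi_base(python="3.11.5", packages={"requests": "2.31.0"})
class DepsFlow(FlowSpec):
    @pypi(packages={"pandas": "2.2.2"})  # step-specific override
    @step
    def start(self):
        import pandas  # resolved inside the step's isolated environment
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    DepsFlow()
```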
@@ -662,87 +460,46 @@ def timeout(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None],
      """
      ...

- @typing.overload
- def environment(*, vars: typing.Dict[str, str] = {}) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
-     """
-     Specifies environment variables to be set prior to the execution of a step.
-
-
-     Parameters
-     ----------
-     vars : Dict[str, str], default {}
-         Dictionary of environment variables to set.
-     """
-     ...
-
- @typing.overload
- def environment(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
-     ...
-
- @typing.overload
- def environment(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
-     ...
-
- def environment(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, vars: typing.Dict[str, str] = {}):
-     """
-     Specifies environment variables to be set prior to the execution of a step.
-
-
-     Parameters
-     ----------
-     vars : Dict[str, str], default {}
-         Dictionary of environment variables to set.
-     """
-     ...
-
- @typing.overload
- def pypi(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def ollama(*, models: list, backend: str, force_pull: bool, cache_update_policy: str, force_cache_update: bool, debug: bool, circuit_breaker_config: dict, timeout_config: dict) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
      """
-     Specifies the PyPI packages for the step.
-
-     Information in this decorator will augment any
-     attributes set in the `@pyi_base` flow-level decorator. Hence,
-     you can use `@pypi_base` to set packages required by all
-     steps and use `@pypi` to specify step-specific overrides.
+     This decorator is used to run Ollama APIs as Metaflow task sidecars.

+     User code call
+     --------------
+     @ollama(
+         models=[...],
+         ...
+     )

-     Parameters
-     ----------
-     packages : Dict[str, str], default: {}
-         Packages to use for this step. The key is the name of the package
-         and the value is the version to use.
-     python : str, optional, default: None
-         Version of Python to use, e.g. '3.7.4'. A default value of None implies
-         that the version used will correspond to the version of the Python interpreter used to start the run.
-     """
-     ...
-
- @typing.overload
- def pypi(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
-     ...
-
- @typing.overload
- def pypi(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
-     ...
-
- def pypi(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
-     """
-     Specifies the PyPI packages for the step.
+     Valid backend options
+     ---------------------
+     - 'local': Run as a separate process on the local task machine.
+     - (TODO) 'managed': Outerbounds hosts and selects compute provider.
+     - (TODO) 'remote': Spin up separate instance to serve Ollama models.

-     Information in this decorator will augment any
-     attributes set in the `@pyi_base` flow-level decorator. Hence,
-     you can use `@pypi_base` to set packages required by all
-     steps and use `@pypi` to specify step-specific overrides.
+     Valid model options
+     -------------------
+     Any model here https://ollama.com/search, e.g. 'llama3.2', 'llama3.3'


      Parameters
      ----------
-     packages : Dict[str, str], default: {}
-         Packages to use for this step. The key is the name of the package
-         and the value is the version to use.
-     python : str, optional, default: None
-         Version of Python to use, e.g. '3.7.4'. A default value of None implies
-         that the version used will correspond to the version of the Python interpreter used to start the run.
+     models: list[str]
+         List of Ollama containers running models in sidecars.
+     backend: str
+         Determines where and how to run the Ollama process.
+     force_pull: bool
+         Whether to run `ollama pull` no matter what, or first check the remote cache in Metaflow datastore for this model key.
+     cache_update_policy: str
+         Cache update policy: "auto", "force", or "never".
+     force_cache_update: bool
+         Simple override for "force" cache update policy.
+     debug: bool
+         Whether to turn on verbose debugging logs.
+     circuit_breaker_config: dict
+         Configuration for circuit breaker protection. Keys: failure_threshold, recovery_timeout, reset_timeout.
+     timeout_config: dict
+         Configuration for various operation timeouts. Keys: pull, stop, health_check, install, server_startup.
      """
      ...
 
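Only the 'local' backend of the relocated `@ollama` decorator is implemented per the docstring; the stub signature lists no defaults, so the sketch below assumes the runtime decorator supplies them and uses an illustrative model name:

```python
# Hedged sketch of @ollama per the docstring above; the model name is
# illustrative and unspecified parameters are assumed to have runtime defaults.
from metaflow import FlowSpec, step, ollama

class ChatFlow(FlowSpec):
    @ollama(models=["llama3.2"], backend="local")
    @step
    def start(self):
        # the sidecar serves the pulled model for the duration of this step
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    ChatFlow()
```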
@@ -827,86 +584,115 @@ def huggingface_hub(*, temp_dir_root: typing.Optional[str] = None, load: typing.
827
584
  ...
828
585
 
829
586
  @typing.overload
830
- def checkpoint(*, load_policy: str = 'fresh', temp_dir_root: str = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
587
+ def environment(*, vars: typing.Dict[str, str] = {}) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
831
588
  """
832
- Enables checkpointing for a step.
833
-
834
- > Examples
835
-
836
- - Saving Checkpoints
837
-
838
- ```python
839
- @checkpoint
840
- @step
841
- def train(self):
842
- model = create_model(self.parameters, checkpoint_path = None)
843
- for i in range(self.epochs):
844
- # some training logic
845
- loss = model.train(self.dataset)
846
- if i % 10 == 0:
847
- model.save(
848
- current.checkpoint.directory,
849
- )
850
- # saves the contents of the `current.checkpoint.directory` as a checkpoint
851
- # and returns a reference dictionary to the checkpoint saved in the datastore
852
- self.latest_checkpoint = current.checkpoint.save(
853
- name="epoch_checkpoint",
854
- metadata={
855
- "epoch": i,
856
- "loss": loss,
857
- }
858
- )
859
- ```
860
-
861
- - Using Loaded Checkpoints
862
-
863
- ```python
864
- @retry(times=3)
865
- @checkpoint
866
- @step
867
- def train(self):
868
- # Assume that the task has restarted and the previous attempt of the task
869
- # saved a checkpoint
870
- checkpoint_path = None
871
- if current.checkpoint.is_loaded: # Check if a checkpoint is loaded
872
- print("Loaded checkpoint from the previous attempt")
873
- checkpoint_path = current.checkpoint.directory
874
-
875
- model = create_model(self.parameters, checkpoint_path = checkpoint_path)
876
- for i in range(self.epochs):
877
- ...
878
- ```
589
+ Specifies environment variables to be set prior to the execution of a step.
879
590
 
880
591
 
881
592
  Parameters
882
593
  ----------
883
- load_policy : str, default: "fresh"
884
- The policy for loading the checkpoint. The following policies are supported:
885
- - "eager": Loads the the latest available checkpoint within the namespace.
886
- With this mode, the latest checkpoint written by any previous task (can be even a different run) of the step
887
- will be loaded at the start of the task.
888
- - "none": Do not load any checkpoint
889
- - "fresh": Loads the lastest checkpoint created within the running Task.
890
- This mode helps loading checkpoints across various retry attempts of the same task.
891
- With this mode, no checkpoint will be loaded at the start of a task but any checkpoints
892
- created within the task will be loaded when the task is retries execution on failure.
893
-
894
- temp_dir_root : str, default: None
895
- The root directory under which `current.checkpoint.directory` will be created.
594
+ vars : Dict[str, str], default {}
595
+ Dictionary of environment variables to set.
896
596
  """
897
597
  ...
898
598
 
899
599
  @typing.overload
900
- def checkpoint(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
600
+ def environment(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
901
601
  ...
902
602
 
903
603
  @typing.overload
904
- def checkpoint(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
604
+ def environment(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
905
605
  ...
906
606
 
907
- def checkpoint(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, load_policy: str = 'fresh', temp_dir_root: str = None):
607
+ def environment(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, vars: typing.Dict[str, str] = {}):
908
608
  """
909
- Enables checkpointing for a step.
609
+ Specifies environment variables to be set prior to the execution of a step.
610
+
611
+
612
+ Parameters
613
+ ----------
614
+ vars : Dict[str, str], default {}
615
+ Dictionary of environment variables to set.
616
+ """
617
+ ...
618
+
619
+ def s3_proxy(*, integration_name: typing.Optional[str] = None, write_mode: typing.Optional[str] = None, debug: typing.Optional[bool] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
620
+ """
621
+ S3 Proxy decorator for routing S3 requests through a local proxy service.
622
+
623
+
624
+ Parameters
625
+ ----------
626
+ integration_name : str, optional
627
+ Name of the S3 proxy integration. If not specified, will use the only
628
+ available S3 proxy integration in the namespace (fails if multiple exist).
629
+ write_mode : str, optional
630
+ The desired behavior during write operations to target (origin) S3 bucket.
631
+ allowed options are:
632
+ "origin-and-cache" -> write to both the target S3 bucket and local object
633
+ storage
634
+ "origin" -> only write to the target S3 bucket
635
+ "cache" -> only write to the object storage service used for caching
636
+ debug : bool, optional
637
+ Enable debug logging for proxy operations.
638
+ """
639
+ ...
640
+
641
+ @typing.overload
+ def catch(*, var: typing.Optional[str] = None, print_exception: bool = True) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ """
+ Specifies that the step will succeed under all circumstances.
+
+ The decorator will create an optional artifact, specified by `var`, which
+ contains the exception raised. You can use it to detect the presence
+ of errors, indicating that all happy-path artifacts produced by the step
+ are missing.
+
+
+ Parameters
+ ----------
+ var : str, optional, default None
+ Name of the artifact in which to store the caught exception.
+ If not specified, the exception is not stored.
+ print_exception : bool, default True
+ Determines whether or not the exception is printed to
+ stdout when caught.
+ """
+ ...
+
+ @typing.overload
+ def catch(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def catch(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def catch(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, var: typing.Optional[str] = None, print_exception: bool = True):
+ """
+ Specifies that the step will succeed under all circumstances.
+
+ The decorator will create an optional artifact, specified by `var`, which
+ contains the exception raised. You can use it to detect the presence
+ of errors, indicating that all happy-path artifacts produced by the step
+ are missing.
+
+
+ Parameters
+ ----------
+ var : str, optional, default None
+ Name of the artifact in which to store the caught exception.
+ If not specified, the exception is not stored.
+ print_exception : bool, default True
+ Determines whether or not the exception is printed to
+ stdout when caught.
+ """
+ ...
+
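A short sketch of the documented pattern: store the exception in an artifact and branch on it downstream (flow, step, and artifact names are illustrative):

```python
from metaflow import FlowSpec, step, catch

class CatchFlow(FlowSpec):
    # If start() raises, the exception is stored in self.failure and the
    # run continues instead of aborting.
    @catch(var="failure", print_exception=True)
    @step
    def start(self):
        self.result = 1 / 0  # deliberately fails
        self.next(self.end)

    @step
    def end(self):
        if getattr(self, "failure", None):
            print("start failed; happy-path artifacts are missing:", self.failure)
        else:
            print("result:", self.result)

if __name__ == "__main__":
    CatchFlow()
```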
+ @typing.overload
+ def checkpoint(*, load_policy: str = 'fresh', temp_dir_root: str = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ """
+ Enables checkpointing for a step.

  > Examples

@@ -974,245 +760,208 @@ def checkpoint(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None
  ...

  @typing.overload
- def test_append_card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- """
- A simple decorator that demonstrates using CardDecoratorInjector
- to inject a card and render simple markdown content.
- """
- ...
-
- @typing.overload
- def test_append_card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def test_append_card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
- """
- A simple decorator that demonstrates using CardDecoratorInjector
- to inject a card and render simple markdown content.
- """
- ...
-
- @typing.overload
- def app_deploy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- """
- Decorator prototype for all step decorators. This function gets specialized
- and imported for all decorators types by _import_plugin_decorators().
- """
+ def checkpoint(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  ...

  @typing.overload
- def app_deploy(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def checkpoint(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def app_deploy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
+ def checkpoint(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, load_policy: str = 'fresh', temp_dir_root: str = None):
  """
- Decorator prototype for all step decorators. This function gets specialized
- and imported for all decorators types by _import_plugin_decorators().
+ Enables checkpointing for a step.
+
+ > Examples
+
+ - Saving Checkpoints
+
+ ```python
+ @checkpoint
+ @step
+ def train(self):
+ model = create_model(self.parameters, checkpoint_path = None)
+ for i in range(self.epochs):
+ # some training logic
+ loss = model.train(self.dataset)
+ if i % 10 == 0:
+ model.save(
+ current.checkpoint.directory,
+ )
+ # saves the contents of the `current.checkpoint.directory` as a checkpoint
+ # and returns a reference dictionary to the checkpoint saved in the datastore
+ self.latest_checkpoint = current.checkpoint.save(
+ name="epoch_checkpoint",
+ metadata={
+ "epoch": i,
+ "loss": loss,
+ }
+ )
+ ```
+
+ - Using Loaded Checkpoints
+
+ ```python
+ @retry(times=3)
+ @checkpoint
+ @step
+ def train(self):
+ # Assume that the task has restarted and the previous attempt of the task
+ # saved a checkpoint
+ checkpoint_path = None
+ if current.checkpoint.is_loaded: # Check if a checkpoint is loaded
+ print("Loaded checkpoint from the previous attempt")
+ checkpoint_path = current.checkpoint.directory
+
+ model = create_model(self.parameters, checkpoint_path = checkpoint_path)
+ for i in range(self.epochs):
+ ...
+ ```
+
+
+ Parameters
+ ----------
+ load_policy : str, default: "fresh"
+ The policy for loading the checkpoint. The following policies are supported:
+ - "eager": Loads the latest available checkpoint within the namespace.
+ With this mode, the latest checkpoint written by any previous task (can be even a different run) of the step
+ will be loaded at the start of the task.
+ - "none": Do not load any checkpoint
+ - "fresh": Loads the latest checkpoint created within the running Task.
+ This mode helps loading checkpoints across various retry attempts of the same task.
+ With this mode, no checkpoint will be loaded at the start of a task but any checkpoints
+ created within the task will be loaded when the task retries execution on failure.
+
+ temp_dir_root : str, default: None
+ The root directory under which `current.checkpoint.directory` will be created.
  """
  ...

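Beyond the docstring's own examples, a small sketch of selecting a non-default `load_policy` (hedged: only the `current.checkpoint` attributes documented above are used; the flow and paths are illustrative):

```python
from metaflow import FlowSpec, step, current, checkpoint

class EagerCkptFlow(FlowSpec):
    # "eager" loads the latest checkpoint in the namespace, even one written
    # by a previous run; the default "fresh" only reloads checkpoints created
    # by earlier attempts of this same task.
    @checkpoint(load_policy="eager", temp_dir_root="/tmp/mf_ckpt")
    @step
    def start(self):
        if current.checkpoint.is_loaded:
            print("resuming from", current.checkpoint.directory)
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    EagerCkptFlow()
```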
  @typing.overload
- def card(*, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def resources(*, cpu: int = 1, gpu: typing.Optional[int] = None, disk: typing.Optional[int] = None, memory: int = 4096, shared_memory: typing.Optional[int] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Creates a human-readable report, a Metaflow Card, after this step completes.
+ Specifies the resources needed when executing this step.

- Note that you may add multiple `@card` decorators in a step with different parameters.
+ Use `@resources` to specify the resource requirements
+ independently of the specific compute layer (`@batch`, `@kubernetes`).
+
+ You can choose the compute layer on the command line by executing e.g.
+ ```
+ python myflow.py run --with batch
+ ```
+ or
+ ```
+ python myflow.py run --with kubernetes
+ ```
+ which executes the flow on the desired system using the
+ requirements specified in `@resources`.


  Parameters
  ----------
- type : str, default 'default'
- Card type.
- id : str, optional, default None
- If multiple cards are present, use this id to identify this card.
- options : Dict[str, Any], default {}
- Options passed to the card. The contents depend on the card type.
- timeout : int, default 45
- Interrupt reporting if it takes more than this many seconds.
+ cpu : int, default 1
+ Number of CPUs required for this step.
+ gpu : int, optional, default None
+ Number of GPUs required for this step.
+ disk : int, optional, default None
+ Disk size (in MB) required for this step. Only applies on Kubernetes.
+ memory : int, default 4096
+ Memory size (in MB) required for this step.
+ shared_memory : int, optional, default None
+ The value for the size (in MiB) of the /dev/shm volume for this step.
+ This parameter maps to the `--shm-size` option in Docker.
  """
  ...

  @typing.overload
- def card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ def resources(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  ...

  @typing.overload
- def card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def resources(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45):
+ def resources(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, cpu: int = 1, gpu: typing.Optional[int] = None, disk: typing.Optional[int] = None, memory: int = 4096, shared_memory: typing.Optional[int] = None):
  """
- Creates a human-readable report, a Metaflow Card, after this step completes.
-
- Note that you may add multiple `@card` decorators in a step with different parameters.
+ Specifies the resources needed when executing this step.

+ Use `@resources` to specify the resource requirements
+ independently of the specific compute layer (`@batch`, `@kubernetes`).

- Parameters
- ----------
- type : str, default 'default'
- Card type.
- id : str, optional, default None
- If multiple cards are present, use this id to identify this card.
- options : Dict[str, Any], default {}
- Options passed to the card. The contents depend on the card type.
- timeout : int, default 45
- Interrupt reporting if it takes more than this many seconds.
- """
- ...
-
- def nvct(*, gpu: int, gpu_type: str) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- Specifies that this step should execute on DGX cloud.
+ You can choose the compute layer on the command line by executing e.g.
+ ```
+ python myflow.py run --with batch
+ ```
+ or
+ ```
+ python myflow.py run --with kubernetes
+ ```
+ which executes the flow on the desired system using the
+ requirements specified in `@resources`.


  Parameters
  ----------
- gpu : int
- Number of GPUs to use.
- gpu_type : str
- Type of Nvidia GPU to use.
- """
- ...
-
- @typing.overload
- def parallel(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- """
- Decorator prototype for all step decorators. This function gets specialized
- and imported for all decorators types by _import_plugin_decorators().
- """
- ...
-
- @typing.overload
- def parallel(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def parallel(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
- """
- Decorator prototype for all step decorators. This function gets specialized
- and imported for all decorators types by _import_plugin_decorators().
- """
- ...
-
- @typing.overload
- def nebius_s3_proxy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- """
- Nebius-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
- It exists to make it easier for users to know that this decorator should only be used with
- a Neo Cloud like Nebius.
- """
- ...
-
- @typing.overload
- def nebius_s3_proxy(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def nebius_s3_proxy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
- """
- Nebius-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
- It exists to make it easier for users to know that this decorator should only be used with
- a Neo Cloud like Nebius.
+ cpu : int, default 1
+ Number of CPUs required for this step.
+ gpu : int, optional, default None
+ Number of GPUs required for this step.
+ disk : int, optional, default None
+ Disk size (in MB) required for this step. Only applies on Kubernetes.
+ memory : int, default 4096
+ Memory size (in MB) required for this step.
+ shared_memory : int, optional, default None
+ The value for the size (in MiB) of the /dev/shm volume for this step.
+ This parameter maps to the `--shm-size` option in Docker.
  """
  ...

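A brief sketch of the pattern the docstring describes: declare requirements once, then pick the compute layer at run time (flow name and values are illustrative):

```python
from metaflow import FlowSpec, step, resources

class BigStepFlow(FlowSpec):
    # Declarative requirements only; run with e.g.
    #   python bigstep.py run --with kubernetes
    # to have them enforced by the chosen compute layer.
    @resources(cpu=4, memory=16384, gpu=1)
    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    BigStepFlow()
```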
- def ollama(*, models: list, backend: str, force_pull: bool, cache_update_policy: str, force_cache_update: bool, debug: bool, circuit_breaker_config: dict, timeout_config: dict) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def vllm(*, model: str, backend: str, openai_api_server: bool, debug: bool, card_refresh_interval: int, max_retries: int, retry_alert_frequency: int, engine_args: dict) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- This decorator is used to run Ollama APIs as Metaflow task sidecars.
+ This decorator is used to run vLLM APIs as Metaflow task sidecars.

  User code call
  --------------
- @ollama(
- models=[...],
+ @vllm(
+ model="...",
  ...
  )

  Valid backend options
  ---------------------
  - 'local': Run as a separate process on the local task machine.
- - (TODO) 'managed': Outerbounds hosts and selects compute provider.
- - (TODO) 'remote': Spin up separate instance to serve Ollama models.

  Valid model options
  -------------------
- Any model here https://ollama.com/search, e.g. 'llama3.2', 'llama3.3'
+ Any HuggingFace model identifier, e.g. 'meta-llama/Llama-3.2-1B'
+
+ NOTE: vLLM's OpenAI-compatible server serves ONE model per server instance.
+ If you need multiple models, you must create multiple @vllm decorators.


  Parameters
  ----------
- models: list[str]
- List of Ollama containers running models in sidecars.
+ model: str
+ HuggingFace model identifier to be served by vLLM.
  backend: str
- Determines where and how to run the Ollama process.
- force_pull: bool
- Whether to run `ollama pull` no matter what, or first check the remote cache in Metaflow datastore for this model key.
- cache_update_policy: str
- Cache update policy: "auto", "force", or "never".
- force_cache_update: bool
- Simple override for "force" cache update policy.
+ Determines where and how to run the vLLM process.
+ openai_api_server: bool
+ Whether to use OpenAI-compatible API server mode (subprocess) instead of the native engine.
+ Default is False (uses native engine).
+ Set to True for backward compatibility with existing code.
  debug: bool
  Whether to turn on verbose debugging logs.
- circuit_breaker_config: dict
- Configuration for circuit breaker protection. Keys: failure_threshold, recovery_timeout, reset_timeout.
- timeout_config: dict
- Configuration for various operation timeouts. Keys: pull, stop, health_check, install, server_startup.
- """
- ...
-
- @typing.overload
- def conda(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- Specifies the Conda environment for the step.
-
- Information in this decorator will augment any
- attributes set in the `@conda_base` flow-level decorator. Hence,
- you can use `@conda_base` to set packages required by all
- steps and use `@conda` to specify step-specific overrides.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables @conda.
- """
- ...
-
- @typing.overload
- def conda(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def conda(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def conda(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
- """
- Specifies the Conda environment for the step.
-
- Information in this decorator will augment any
- attributes set in the `@conda_base` flow-level decorator. Hence,
- you can use `@conda_base` to set packages required by all
- steps and use `@conda` to specify step-specific overrides.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables @conda.
+ card_refresh_interval: int
+ Interval in seconds for refreshing the vLLM status card.
+ Only used when openai_api_server=True.
+ max_retries: int
+ Maximum number of retries checking for vLLM server startup.
+ Only used when openai_api_server=True.
+ retry_alert_frequency: int
+ Frequency of alert logs for vLLM server startup retries.
+ Only used when openai_api_server=True.
+ engine_args : dict
+ Additional keyword arguments to pass to the vLLM engine.
+ For example, `tensor_parallel_size=2`.
  """
  ...
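A hedged sketch of attaching the vLLM sidecar. The stub signature lists every parameter without defaults, so all arguments are passed explicitly here; whether runtime defaults exist is not shown in this diff, and all values are illustrative:

```python
from metaflow import FlowSpec, step, vllm

class VLLMFlow(FlowSpec):
    # One @vllm decorator serves exactly one model, per the note above.
    @vllm(
        model="meta-llama/Llama-3.2-1B",
        backend="local",
        openai_api_server=True,   # subprocess server mode
        debug=False,
        card_refresh_interval=10,
        max_retries=30,
        retry_alert_frequency=5,
        engine_args={"tensor_parallel_size": 1},
    )
    @step
    def start(self):
        # Client code talking to the served model is elided here.
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    VLLMFlow()
```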
 
@@ -1255,23 +1004,6 @@ def secrets(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None],
  """
  ...

- @typing.overload
- def fast_bakery_internal(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- """
- Internal decorator to support Fast bakery
- """
- ...
-
- @typing.overload
- def fast_bakery_internal(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def fast_bakery_internal(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
- """
- Internal decorator to support Fast bakery
- """
- ...
-
  @typing.overload
  def coreweave_s3_proxy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  """
@@ -1294,414 +1026,461 @@ def coreweave_s3_proxy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFla
  ...

  @typing.overload
- def retry(*, times: int = 3, minutes_between_retries: int = 2) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def conda(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies the number of times the task corresponding
- to a step needs to be retried.
-
- This decorator is useful for handling transient errors, such as networking issues.
- If your task contains operations that can't be retried safely, e.g. database updates,
- it is advisable to annotate it with `@retry(times=0)`.
+ Specifies the Conda environment for the step.

- This can be used in conjunction with the `@catch` decorator. The `@catch`
- decorator will execute a no-op task after all retries have been exhausted,
- ensuring that the flow execution can continue.
+ Information in this decorator will augment any
+ attributes set in the `@conda_base` flow-level decorator. Hence,
+ you can use `@conda_base` to set packages required by all
+ steps and use `@conda` to specify step-specific overrides.


  Parameters
  ----------
- times : int, default 3
- Number of times to retry this task.
- minutes_between_retries : int, default 2
- Number of minutes between retries.
+ packages : Dict[str, str], default {}
+ Packages to use for this step. The key is the name of the package
+ and the value is the version to use.
+ libraries : Dict[str, str], default {}
+ Supported for backward compatibility. When used with packages, packages will take precedence.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ disabled : bool, default False
+ If set to True, disables @conda.
  """
  ...

  @typing.overload
- def retry(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ def conda(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  ...

  @typing.overload
- def retry(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def conda(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def retry(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, times: int = 3, minutes_between_retries: int = 2):
+ def conda(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
  """
- Specifies the number of times the task corresponding
- to a step needs to be retried.
-
- This decorator is useful for handling transient errors, such as networking issues.
- If your task contains operations that can't be retried safely, e.g. database updates,
- it is advisable to annotate it with `@retry(times=0)`.
+ Specifies the Conda environment for the step.

- This can be used in conjunction with the `@catch` decorator. The `@catch`
- decorator will execute a no-op task after all retries have been exhausted,
- ensuring that the flow execution can continue.
+ Information in this decorator will augment any
+ attributes set in the `@conda_base` flow-level decorator. Hence,
+ you can use `@conda_base` to set packages required by all
+ steps and use `@conda` to specify step-specific overrides.


  Parameters
  ----------
- times : int, default 3
- Number of times to retry this task.
- minutes_between_retries : int, default 2
- Number of minutes between retries.
+ packages : Dict[str, str], default {}
+ Packages to use for this step. The key is the name of the package
+ and the value is the version to use.
+ libraries : Dict[str, str], default {}
+ Supported for backward compatibility. When used with packages, packages will take precedence.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ disabled : bool, default False
+ If set to True, disables @conda.
  """
  ...

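A small sketch of the flow-level/step-level split the docstring describes (package versions are illustrative assumptions):

```python
from metaflow import FlowSpec, step, conda, conda_base

@conda_base(python="3.10.4", packages={"numpy": "1.26.4"})
class CondaFlow(FlowSpec):
    # @conda augments @conda_base: this step additionally gets pandas.
    @conda(packages={"pandas": "2.2.2"})
    @step
    def start(self):
        import pandas as pd  # resolved inside the step's Conda environment
        self.rows = len(pd.DataFrame({"x": [1, 2, 3]}))
        self.next(self.end)

    @step
    def end(self):
        print(self.rows)

if __name__ == "__main__":
    CondaFlow()
```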
- @typing.overload
- def pypi_base(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def nvidia(*, gpu: int, gpu_type: str, queue_timeout: int) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies the PyPI packages for all steps of the flow.
+ Specifies that this step should execute on DGX cloud.

- Use `@pypi_base` to set common packages required by all
- steps and use `@pypi` to specify step-specific overrides.

  Parameters
  ----------
- packages : Dict[str, str], default: {}
- Packages to use for this flow. The key is the name of the package
- and the value is the version to use.
- python : str, optional, default: None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- """
- ...
-
- @typing.overload
- def pypi_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ gpu : int
+ Number of GPUs to use.
+ gpu_type : str
+ Type of Nvidia GPU to use.
+ queue_timeout : int
+ Time to keep the job in NVCF's queue.
+ """
  ...

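A usage sketch for `@nvidia` (hedged: the GPU type string and the units of `queue_timeout` are assumptions for illustration; the docstring above does not pin them down):

```python
from metaflow import FlowSpec, step, nvidia

class DGXFlow(FlowSpec):
    # Run this step on DGX cloud with one GPU; "H100" and 3600 are
    # illustrative values only.
    @nvidia(gpu=1, gpu_type="H100", queue_timeout=3600)
    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    DGXFlow()
```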
- def pypi_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
+ def kubernetes(*, cpu: int = 1, memory: int = 4096, disk: int = 10240, image: typing.Optional[str] = None, image_pull_policy: str = 'KUBERNETES_IMAGE_PULL_POLICY', image_pull_secrets: typing.List[str] = [], service_account: str = 'METAFLOW_KUBERNETES_SERVICE_ACCOUNT', secrets: typing.Optional[typing.List[str]] = None, node_selector: typing.Union[typing.Dict[str, str], str, None] = None, namespace: str = 'METAFLOW_KUBERNETES_NAMESPACE', gpu: typing.Optional[int] = None, gpu_vendor: str = 'KUBERNETES_GPU_VENDOR', tolerations: typing.List[typing.Dict[str, str]] = [], labels: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_LABELS', annotations: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_ANNOTATIONS', use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: typing.Optional[int] = None, tmpfs_path: typing.Optional[str] = '/metaflow_temp', persistent_volume_claims: typing.Optional[typing.Dict[str, str]] = None, shared_memory: typing.Optional[int] = None, port: typing.Optional[int] = None, compute_pool: typing.Optional[str] = None, hostname_resolution_timeout: int = 600, qos: str = 'Burstable', security_context: typing.Optional[typing.Dict[str, typing.Any]] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies the PyPI packages for all steps of the flow.
+ Specifies that this step should execute on Kubernetes.

- Use `@pypi_base` to set common packages required by all
- steps and use `@pypi` to specify step-specific overrides.

  Parameters
  ----------
- packages : Dict[str, str], default: {}
- Packages to use for this flow. The key is the name of the package
- and the value is the version to use.
- python : str, optional, default: None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
+ cpu : int, default 1
+ Number of CPUs required for this step. If `@resources` is
+ also present, the maximum value from all decorators is used.
+ memory : int, default 4096
+ Memory size (in MB) required for this step. If
+ `@resources` is also present, the maximum value from all decorators is
+ used.
+ disk : int, default 10240
+ Disk size (in MB) required for this step. If
+ `@resources` is also present, the maximum value from all decorators is
+ used.
+ image : str, optional, default None
+ Docker image to use when launching on Kubernetes. If not specified, and
+ METAFLOW_KUBERNETES_CONTAINER_IMAGE is specified, that image is used. If
+ not, a default Docker image mapping to the current version of Python is used.
+ image_pull_policy: str, default KUBERNETES_IMAGE_PULL_POLICY
+ If given, the imagePullPolicy to be applied to the Docker image of the step.
+ image_pull_secrets: List[str], default []
+ The default is extracted from METAFLOW_KUBERNETES_IMAGE_PULL_SECRETS.
+ Kubernetes image pull secrets to use when pulling container images
+ in Kubernetes.
+ service_account : str, default METAFLOW_KUBERNETES_SERVICE_ACCOUNT
+ Kubernetes service account to use when launching pod in Kubernetes.
+ secrets : List[str], optional, default None
+ Kubernetes secrets to use when launching pod in Kubernetes. These
+ secrets are in addition to the ones defined in `METAFLOW_KUBERNETES_SECRETS`
+ in Metaflow configuration.
+ node_selector: Union[Dict[str,str], str], optional, default None
+ Kubernetes node selector(s) to apply to the pod running the task.
+ Can be passed in as a comma separated string of values e.g.
+ 'kubernetes.io/os=linux,kubernetes.io/arch=amd64' or as a dictionary
+ {'kubernetes.io/os': 'linux', 'kubernetes.io/arch': 'amd64'}
+ namespace : str, default METAFLOW_KUBERNETES_NAMESPACE
+ Kubernetes namespace to use when launching pod in Kubernetes.
+ gpu : int, optional, default None
+ Number of GPUs required for this step. A value of zero implies that
+ the scheduled node should not have GPUs.
+ gpu_vendor : str, default KUBERNETES_GPU_VENDOR
+ The vendor of the GPUs to be used for this step.
+ tolerations : List[Dict[str,str]], default []
+ The default is extracted from METAFLOW_KUBERNETES_TOLERATIONS.
+ Kubernetes tolerations to use when launching pod in Kubernetes.
+ labels: Dict[str, str], default: METAFLOW_KUBERNETES_LABELS
+ Kubernetes labels to use when launching pod in Kubernetes.
+ annotations: Dict[str, str], default: METAFLOW_KUBERNETES_ANNOTATIONS
+ Kubernetes annotations to use when launching pod in Kubernetes.
+ use_tmpfs : bool, default False
+ This enables an explicit tmpfs mount for this step.
+ tmpfs_tempdir : bool, default True
+ Sets METAFLOW_TEMPDIR to tmpfs_path if set for this step.
+ tmpfs_size : int, optional, default: None
+ The value for the size (in MiB) of the tmpfs mount for this step.
+ This parameter maps to the `--tmpfs` option in Docker. Defaults to 50% of the
+ memory allocated for this step.
+ tmpfs_path : str, optional, default /metaflow_temp
+ Path to tmpfs mount for this step.
+ persistent_volume_claims : Dict[str, str], optional, default None
+ A map (dictionary) of persistent volumes to be mounted to the pod for this step. The map is from persistent
+ volumes to the path to which the volume is to be mounted, e.g., `{'pvc-name': '/path/to/mount/on'}`.
+ shared_memory: int, optional
+ Shared memory size (in MiB) required for this step.
+ port: int, optional
+ Port number to specify in the Kubernetes job object.
+ compute_pool : str, optional, default None
+ Compute pool to be used for this step.
+ If not specified, any accessible compute pool within the perimeter is used.
+ hostname_resolution_timeout: int, default 10 * 60
+ Timeout in seconds for the worker tasks in the gang scheduled cluster to resolve the hostname of the control task.
+ Only applicable when @parallel is used.
+ qos: str, default: Burstable
+ Quality of Service class to assign to the pod. Supported values are: Guaranteed, Burstable, BestEffort
+
+ security_context: Dict[str, Any], optional, default None
+ Container security context. Applies to the task container. Allows the following keys:
+ - privileged: bool, optional, default None
+ - allow_privilege_escalation: bool, optional, default None
+ - run_as_user: int, optional, default None
+ - run_as_group: int, optional, default None
+ - run_as_non_root: bool, optional, default None
  """
  ...

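A condensed sketch of the decorator in use (all values are illustrative; anything left unset falls back to the METAFLOW_KUBERNETES_* configuration defaults named above):

```python
from metaflow import FlowSpec, step, kubernetes

class K8sFlow(FlowSpec):
    @kubernetes(
        cpu=2,
        memory=8192,
        disk=20480,
        image="python:3.11",  # assumed image
        node_selector="kubernetes.io/arch=amd64",
        tolerations=[{"key": "dedicated", "operator": "Equal",
                      "value": "ml", "effect": "NoSchedule"}],
        qos="Burstable",
    )
    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    K8sFlow()
```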
  @typing.overload
- def schedule(*, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def parallel(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  """
- Specifies the times when the flow should be run when running on a
- production scheduler.
-
-
- Parameters
- ----------
- hourly : bool, default False
- Run the workflow hourly.
- daily : bool, default True
- Run the workflow daily.
- weekly : bool, default False
- Run the workflow weekly.
- cron : str, optional, default None
- Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
- specified by this expression.
- timezone : str, optional, default None
- Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
- which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
+ Decorator prototype for all step decorators. This function gets specialized
+ and imported for all decorator types by _import_plugin_decorators().
  """
  ...

  @typing.overload
- def schedule(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ def parallel(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def schedule(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None):
+ def parallel(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
  """
- Specifies the times when the flow should be run when running on a
- production scheduler.
+ Decorator prototype for all step decorators. This function gets specialized
+ and imported for all decorator types by _import_plugin_decorators().
+ """
+ ...
+
+ @typing.overload
+ def card(*, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ """
+ Creates a human-readable report, a Metaflow Card, after this step completes.
+
+ Note that you may add multiple `@card` decorators in a step with different parameters.


  Parameters
  ----------
- hourly : bool, default False
- Run the workflow hourly.
- daily : bool, default True
- Run the workflow daily.
- weekly : bool, default False
- Run the workflow weekly.
- cron : str, optional, default None
- Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
- specified by this expression.
- timezone : str, optional, default None
- Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
- which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
+ type : str, default 'default'
+ Card type.
+ id : str, optional, default None
+ If multiple cards are present, use this id to identify this card.
+ options : Dict[str, Any], default {}
+ Options passed to the card. The contents depend on the card type.
+ timeout : int, default 45
+ Interrupt reporting if it takes more than this many seconds.
  """
  ...

- def project(*, name: str, branch: typing.Optional[str] = None, production: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ @typing.overload
+ def card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45):
  """
- Specifies what flows belong to the same project.
+ Creates a human-readable report, a Metaflow Card, after this step completes.

- A project-specific namespace is created for all flows that
- use the same `@project(name)`.
+ Note that you may add multiple `@card` decorators in a step with different parameters.


  Parameters
  ----------
- name : str
- Project name. Make sure that the name is unique amongst all
- projects that use the same production scheduler. The name may
- contain only lowercase alphanumeric characters and underscores.
-
- branch : Optional[str], default None
- The branch to use. If not specified, the branch is set to
- `user.<username>` unless `production` is set to `True`. This can
- also be set on the command line using `--branch` as a top-level option.
- It is an error to specify `branch` in the decorator and on the command line.
-
- production : bool, default False
- Whether or not the branch is the production branch. This can also be set on the
- command line using `--production` as a top-level option. It is an error to specify
- `production` in the decorator and on the command line.
- The project branch name will be:
- - if `branch` is specified:
- - if `production` is True: `prod.<branch>`
- - if `production` is False: `test.<branch>`
- - if `branch` is not specified:
- - if `production` is True: `prod`
- - if `production` is False: `user.<username>`
+ type : str, default 'default'
+ Card type.
+ id : str, optional, default None
+ If multiple cards are present, use this id to identify this card.
+ options : Dict[str, Any], default {}
+ Options passed to the card. The contents depend on the card type.
+ timeout : int, default 45
+ Interrupt reporting if it takes more than this many seconds.
  """
  ...

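A sketch of the multiple-cards note above: two `@card` decorators on one step, one updated from user code via Metaflow's documented `current.card` / `metaflow.cards.Markdown` API (flow and id names are illustrative):

```python
from metaflow import FlowSpec, step, card, current
from metaflow.cards import Markdown

class ReportFlow(FlowSpec):
    @card(type="default")                        # auto-generated card
    @card(type="blank", id="notes", timeout=60)  # addressed by id below
    @step
    def start(self):
        current.card["notes"].append(Markdown("# Run notes"))
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    ReportFlow()
```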
- def airflow_s3_key_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, bucket_key: typing.Union[str, typing.List[str]], bucket_name: str, wildcard_match: bool, aws_conn_id: str, verify: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ @typing.overload
+ def fast_bakery_internal(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  """
- The `@airflow_s3_key_sensor` decorator attaches a Airflow [S3KeySensor](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/_api/airflow/providers/amazon/aws/sensors/s3/index.html#airflow.providers.amazon.aws.sensors.s3.S3KeySensor)
- before the start step of the flow. This decorator only works when a flow is scheduled on Airflow
- and is compiled using `airflow create`. More than one `@airflow_s3_key_sensor` can be
- added as a flow decorators. Adding more than one decorator will ensure that `start` step
- starts only after all sensors finish.
+ Internal decorator to support Fast bakery
+ """
+ ...
+
+ @typing.overload
+ def fast_bakery_internal(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def fast_bakery_internal(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
+ """
+ Internal decorator to support Fast bakery
+ """
+ ...
+
+ @typing.overload
+ def retry(*, times: int = 3, minutes_between_retries: int = 2) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ """
+ Specifies the number of times the task corresponding
+ to a step needs to be retried.
+
+ This decorator is useful for handling transient errors, such as networking issues.
+ If your task contains operations that can't be retried safely, e.g. database updates,
+ it is advisable to annotate it with `@retry(times=0)`.
+
+ This can be used in conjunction with the `@catch` decorator. The `@catch`
+ decorator will execute a no-op task after all retries have been exhausted,
+ ensuring that the flow execution can continue.


  Parameters
  ----------
- timeout : int
- Time, in seconds before the task times out and fails. (Default: 3600)
- poke_interval : int
- Time in seconds that the job should wait in between each try. (Default: 60)
- mode : str
- How the sensor operates. Options are: { poke | reschedule }. (Default: "poke")
- exponential_backoff : bool
- allow progressive longer waits between pokes by using exponential backoff algorithm. (Default: True)
- pool : str
- the slot pool this task should run in,
- slot pools are a way to limit concurrency for certain tasks. (Default:None)
- soft_fail : bool
- Set to true to mark the task as SKIPPED on failure. (Default: False)
- name : str
- Name of the sensor on Airflow
- description : str
- Description of sensor in the Airflow UI
- bucket_key : Union[str, List[str]]
- The key(s) being waited on. Supports full s3:// style url or relative path from root level.
- When it's specified as a full s3:// url, please leave `bucket_name` as None
- bucket_name : str
- Name of the S3 bucket. Only needed when bucket_key is not provided as a full s3:// url.
- When specified, all the keys passed to bucket_key refers to this bucket. (Default:None)
- wildcard_match : bool
- whether the bucket_key should be interpreted as a Unix wildcard pattern. (Default: False)
- aws_conn_id : str
- a reference to the s3 connection on Airflow. (Default: None)
- verify : bool
- Whether or not to verify SSL certificates for S3 connection. (Default: None)
+ times : int, default 3
+ Number of times to retry this task.
+ minutes_between_retries : int, default 2
+ Number of minutes between retries.
  """
  ...

  @typing.overload
- def trigger(*, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def retry(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def retry(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def retry(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, times: int = 3, minutes_between_retries: int = 2):
  """
- Specifies the event(s) that this flow depends on.
-
- ```
- @trigger(event='foo')
- ```
- or
- ```
- @trigger(events=['foo', 'bar'])
- ```
+ Specifies the number of times the task corresponding
+ to a step needs to be retried.

- Additionally, you can specify the parameter mappings
- to map event payload to Metaflow parameters for the flow.
- ```
- @trigger(event={'name':'foo', 'parameters':{'flow_param': 'event_field'}})
- ```
- or
- ```
- @trigger(events=[{'name':'foo', 'parameters':{'flow_param_1': 'event_field_1'},
- {'name':'bar', 'parameters':{'flow_param_2': 'event_field_2'}])
- ```
+ This decorator is useful for handling transient errors, such as networking issues.
+ If your task contains operations that can't be retried safely, e.g. database updates,
+ it is advisable to annotate it with `@retry(times=0)`.

- 'parameters' can also be a list of strings and tuples like so:
- ```
- @trigger(event={'name':'foo', 'parameters':['common_name', ('flow_param', 'event_field')]})
- ```
- This is equivalent to:
- ```
- @trigger(event={'name':'foo', 'parameters':{'common_name': 'common_name', 'flow_param': 'event_field'}})
- ```
+ This can be used in conjunction with the `@catch` decorator. The `@catch`
+ decorator will execute a no-op task after all retries have been exhausted,
+ ensuring that the flow execution can continue.


  Parameters
  ----------
- event : Union[str, Dict[str, Any]], optional, default None
- Event dependency for this flow.
- events : List[Union[str, Dict[str, Any]]], default []
- Events dependency for this flow.
- options : Dict[str, Any], default {}
- Backend-specific configuration for tuning eventing behavior.
+ times : int, default 3
+ Number of times to retry this task.
+ minutes_between_retries : int, default 2
+ Number of minutes between retries.
  """
  ...

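A sketch of combining `@retry` with `@catch`, as the docstring suggests (the flaky call is a hypothetical helper):

```python
from metaflow import FlowSpec, step, retry, catch

class RobustFlow(FlowSpec):
    # @retry absorbs transient failures; if every attempt fails, @catch
    # records the exception so the flow can still continue.
    @catch(var="fetch_error")
    @retry(times=3, minutes_between_retries=1)
    @step
    def start(self):
        self.data = fetch_remote_data()  # hypothetical flaky call
        self.next(self.end)

    @step
    def end(self):
        if getattr(self, "fetch_error", None):
            print("gave up after retries:", self.fetch_error)

if __name__ == "__main__":
    RobustFlow()
```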
  @typing.overload
- def trigger(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ def app_deploy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ """
+ Decorator prototype for all step decorators. This function gets specialized
+ and imported for all decorator types by _import_plugin_decorators().
+ """
  ...

- def trigger(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}):
+ @typing.overload
+ def app_deploy(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def app_deploy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
  """
- Specifies the event(s) that this flow depends on.
+ Decorator prototype for all step decorators. This function gets specialized
+ and imported for all decorator types by _import_plugin_decorators().
+ """
+ ...
+
+ @typing.overload
+ def trigger_on_finish(*, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ Specifies the flow(s) that this flow depends on.

  ```
- @trigger(event='foo')
+ @trigger_on_finish(flow='FooFlow')
  ```
  or
  ```
- @trigger(events=['foo', 'bar'])
+ @trigger_on_finish(flows=['FooFlow', 'BarFlow'])
  ```
+ This decorator respects the @project decorator and triggers the flow
+ when upstream runs within the same namespace complete successfully.

- Additionally, you can specify the parameter mappings
- to map event payload to Metaflow parameters for the flow.
+ Additionally, you can specify project-aware upstream flow dependencies
+ by specifying the fully qualified project_flow_name.
  ```
- @trigger(event={'name':'foo', 'parameters':{'flow_param': 'event_field'}})
+ @trigger_on_finish(flow='my_project.branch.my_branch.FooFlow')
  ```
  or
  ```
- @trigger(events=[{'name':'foo', 'parameters':{'flow_param_1': 'event_field_1'},
- {'name':'bar', 'parameters':{'flow_param_2': 'event_field_2'}])
+ @trigger_on_finish(flows=['my_project.branch.my_branch.FooFlow', 'BarFlow'])
  ```

- 'parameters' can also be a list of strings and tuples like so:
- ```
- @trigger(event={'name':'foo', 'parameters':['common_name', ('flow_param', 'event_field')]})
- ```
- This is equivalent to:
+ You can also specify just the project or project branch (other values will be
+ inferred from the current project or project branch):
  ```
- @trigger(event={'name':'foo', 'parameters':{'common_name': 'common_name', 'flow_param': 'event_field'}})
+ @trigger_on_finish(flow={"name": "FooFlow", "project": "my_project", "project_branch": "branch"})
  ```

-
- Parameters
- ----------
- event : Union[str, Dict[str, Any]], optional, default None
- Event dependency for this flow.
- events : List[Union[str, Dict[str, Any]]], default []
- Events dependency for this flow.
- options : Dict[str, Any], default {}
- Backend-specific configuration for tuning eventing behavior.
- """
- ...
-
- def airflow_external_task_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, external_dag_id: str, external_task_ids: typing.List[str], allowed_states: typing.List[str], failed_states: typing.List[str], execution_delta: "datetime.timedelta", check_existence: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
- """
- The `@airflow_external_task_sensor` decorator attaches a Airflow [ExternalTaskSensor](https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/sensors/external_task/index.html#airflow.sensors.external_task.ExternalTaskSensor) before the start step of the flow.
- This decorator only works when a flow is scheduled on Airflow and is compiled using `airflow create`. More than one `@airflow_external_task_sensor` can be added as a flow decorators. Adding more than one decorator will ensure that `start` step starts only after all sensors finish.
-
-
- Parameters
- ----------
- timeout : int
- Time, in seconds before the task times out and fails. (Default: 3600)
- poke_interval : int
- Time in seconds that the job should wait in between each try. (Default: 60)
- mode : str
- How the sensor operates. Options are: { poke | reschedule }. (Default: "poke")
- exponential_backoff : bool
- allow progressive longer waits between pokes by using exponential backoff algorithm. (Default: True)
- pool : str
- the slot pool this task should run in,
- slot pools are a way to limit concurrency for certain tasks. (Default:None)
- soft_fail : bool
- Set to true to mark the task as SKIPPED on failure. (Default: False)
- name : str
- Name of the sensor on Airflow
- description : str
- Description of sensor in the Airflow UI
- external_dag_id : str
- The dag_id that contains the task you want to wait for.
- external_task_ids : List[str]
- The list of task_ids that you want to wait for.
- If None (default value) the sensor waits for the DAG. (Default: None)
- allowed_states : List[str]
- Iterable of allowed states, (Default: ['success'])
- failed_states : List[str]
- Iterable of failed or dis-allowed states. (Default: None)
- execution_delta : datetime.timedelta
- time difference with the previous execution to look at,
- the default is the same logical date as the current task or DAG. (Default: None)
- check_existence: bool
- Set to True to check if the external task exists or check if
- the DAG to wait for exists. (Default: True)
+ Note that `branch` is typically one of:
+ - `prod`
+ - `user.bob`
+ - `test.my_experiment`
+ - `prod.staging`
+
+
+ Parameters
+ ----------
+ flow : Union[str, Dict[str, str]], optional, default None
+ Upstream flow dependency for this flow.
+ flows : List[Union[str, Dict[str, str]]], default []
+ Upstream flow dependencies for this flow.
+ options : Dict[str, Any], default {}
+ Backend-specific configuration for tuning eventing behavior.
  """
  ...
 
  @typing.overload
- def conda_base(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def trigger_on_finish(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ ...
+
+ def trigger_on_finish(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}):
  """
- Specifies the Conda environment for all steps of the flow.
+ Specifies the flow(s) that this flow depends on.

- Use `@conda_base` to set common libraries required by all
- steps and use `@conda` to specify step-specific additions.
+ ```
+ @trigger_on_finish(flow='FooFlow')
+ ```
+ or
+ ```
+ @trigger_on_finish(flows=['FooFlow', 'BarFlow'])
+ ```
+ This decorator respects the @project decorator and triggers the flow
+ when upstream runs within the same namespace complete successfully.
+
+ Additionally, you can specify project-aware upstream flow dependencies
+ by specifying the fully qualified project_flow_name.
+ ```
+ @trigger_on_finish(flow='my_project.branch.my_branch.FooFlow')
+ ```
+ or
+ ```
+ @trigger_on_finish(flows=['my_project.branch.my_branch.FooFlow', 'BarFlow'])
+ ```
+
+ You can also specify just the project or project branch (other values will be
+ inferred from the current project or project branch):
+ ```
+ @trigger_on_finish(flow={"name": "FooFlow", "project": "my_project", "project_branch": "branch"})
+ ```
+
+ Note that `branch` is typically one of:
+ - `prod`
+ - `user.bob`
+ - `test.my_experiment`
+ - `prod.staging`


  Parameters
  ----------
- packages : Dict[str, str], default {}
- Packages to use for this flow. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables Conda.
+ flow : Union[str, Dict[str, str]], optional, default None
+ Upstream flow dependency for this flow.
+ flows : List[Union[str, Dict[str, str]]], default []
+ Upstream flow dependencies for this flow.
+ options : Dict[str, Any], default {}
+ Backend-specific configuration for tuning eventing behavior.
  """
  ...
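
To ground the docstring above, here is a minimal sketch of `@trigger_on_finish` in a flow file; the flow and step names are illustrative, and the trigger wiring takes effect when the flow is deployed to a production orchestrator rather than in local runs.

```python
from metaflow import FlowSpec, step, trigger_on_finish

# Start BarFlow whenever FooFlow (an illustrative upstream flow) completes
# successfully in the same namespace.
@trigger_on_finish(flow='FooFlow')
class BarFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    BarFlow()
```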

- @typing.overload
- def conda_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
- ...
-
- def conda_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
+ def project(*, name: str, branch: typing.Optional[str] = None, production: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
  """
- Specifies the Conda environment for all steps of the flow.
+ Specifies what flows belong to the same project.

- Use `@conda_base` to set common libraries required by all
- steps and use `@conda` to specify step-specific additions.
+ A project-specific namespace is created for all flows that
+ use the same `@project(name)`.


  Parameters
  ----------
- packages : Dict[str, str], default {}
- Packages to use for this flow. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables Conda.
+ name : str
+ Project name. Make sure that the name is unique amongst all
+ projects that use the same production scheduler. The name may
+ contain only lowercase alphanumeric characters and underscores.
+
+ branch : Optional[str], default None
+ The branch to use. If not specified, the branch is set to
+ `user.<username>` unless `production` is set to `True`. This can
+ also be set on the command line using `--branch` as a top-level option.
+ It is an error to specify `branch` both in the decorator and on the command line.
+
+ production : bool, default False
+ Whether or not the branch is the production branch. This can also be set on the
+ command line using `--production` as a top-level option. It is an error to specify
+ `production` both in the decorator and on the command line.
+ The project branch name will be:
+ - if `branch` is specified:
+ - if `production` is True: `prod.<branch>`
+ - if `production` is False: `test.<branch>`
+ - if `branch` is not specified:
+ - if `production` is True: `prod`
+ - if `production` is False: `user.<username>`
  """
  ...
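
A minimal sketch of `@project` consistent with the docstring above; the project name is illustrative:

```python
from metaflow import FlowSpec, project, step

# All flows decorated with @project(name='fraud_detection') share one
# project-specific namespace; 'fraud_detection' is an illustrative name.
@project(name='fraud_detection')
class TrainingFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    TrainingFlow()
```

Running `python training_flow.py --branch my_experiment run` would then place the run on the `test.my_experiment` branch, per the branch-naming rules listed above.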

@@ -1769,156 +1548,377 @@ def with_artifact_store(f: typing.Optional[typing.Type[FlowSpecDerived]] = None)

  ```

- - Accessing objects stored in external datastores after task execution.
+ - Accessing objects stored in external datastores after task execution.
+
+ ```python
+ run = Run("CheckpointsTestsFlow/8992")
+ with artifact_store_from(run=run, config={
+ "client_params": {
+ "aws_access_key_id": os.environ.get("MY_CUSTOM_ACCESS_KEY"),
+ "aws_secret_access_key": os.environ.get("MY_CUSTOM_SECRET_KEY"),
+ },
+ }):
+ with Checkpoint() as cp:
+ latest = cp.list(
+ task=run["start"].task
+ )[0]
+ print(latest)
+ cp.load(
+ latest,
+ "test-checkpoints"
+ )
+
+ task = Task("TorchTuneFlow/8484/train/53673")
+ with artifact_store_from(run=run, config={
+ "client_params": {
+ "aws_access_key_id": os.environ.get("MY_CUSTOM_ACCESS_KEY"),
+ "aws_secret_access_key": os.environ.get("MY_CUSTOM_SECRET_KEY"),
+ },
+ }):
+ load_model(
+ task.data.model_ref,
+ "test-models"
+ )
+ ```
+ Parameters
+ ----------
+
+ type : str
+ The type of the datastore. Can be one of 's3', 'gcs', 'azure', or any other supported Metaflow datastore.
+
+ config : dict or Callable
+ Dictionary of configuration options for the datastore. The following keys are required:
+ - root: The root path in the datastore where the data will be saved. (Needs to be in the format expected by the datastore.)
+ - example: 's3://bucket-name/path/to/root'
+ - example: 'gs://bucket-name/path/to/root'
+ - example: 'https://myblockacc.blob.core.windows.net/metaflow/'
+ - role_arn (optional): AWS IAM role to access the s3 bucket (only when `type` is 's3')
+ - session_vars (optional): AWS session variables to access the s3 bucket (only when `type` is 's3')
+ - client_params (optional): AWS client parameters to access the s3 bucket (only when `type` is 's3')
+ """
+ ...
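
To make the required `config` shape concrete, here is a sketch of a dictionary for an 's3'-type datastore; the bucket path, role ARN, and environment variable names are illustrative:

```python
import os

# Illustrative config for type='s3': 'root' is the required key; role_arn,
# session_vars, and client_params are optional per the docstring above.
custom_s3_config = {
    "root": "s3://bucket-name/path/to/root",
    "role_arn": "arn:aws:iam::123456789012:role/datastore-access",  # hypothetical role
    "client_params": {
        "aws_access_key_id": os.environ.get("MY_CUSTOM_ACCESS_KEY"),
        "aws_secret_access_key": os.environ.get("MY_CUSTOM_SECRET_KEY"),
    },
}
```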
+
+ def airflow_s3_key_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, bucket_key: typing.Union[str, typing.List[str]], bucket_name: str, wildcard_match: bool, aws_conn_id: str, verify: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ The `@airflow_s3_key_sensor` decorator attaches an Airflow [S3KeySensor](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/_api/airflow/providers/amazon/aws/sensors/s3/index.html#airflow.providers.amazon.aws.sensors.s3.S3KeySensor)
+ before the start step of the flow. This decorator only works when a flow is scheduled on Airflow
+ and is compiled using `airflow create`. More than one `@airflow_s3_key_sensor` can be
+ added as flow decorators. Adding more than one decorator will ensure that the `start` step
+ starts only after all sensors finish.
+
+
+ Parameters
+ ----------
+ timeout : int
+ Time, in seconds, before the task times out and fails. (Default: 3600)
+ poke_interval : int
+ Time in seconds that the job should wait in between each try. (Default: 60)
+ mode : str
+ How the sensor operates. Options are: { poke | reschedule }. (Default: "poke")
+ exponential_backoff : bool
+ Allow progressively longer waits between pokes by using an exponential backoff algorithm. (Default: True)
+ pool : str
+ The slot pool this task should run in;
+ slot pools are a way to limit concurrency for certain tasks. (Default: None)
+ soft_fail : bool
+ Set to True to mark the task as SKIPPED on failure. (Default: False)
+ name : str
+ Name of the sensor on Airflow.
+ description : str
+ Description of the sensor in the Airflow UI.
+ bucket_key : Union[str, List[str]]
+ The key(s) being waited on. Supports a full s3:// style url or a relative path from the root level.
+ When it is specified as a full s3:// url, please leave `bucket_name` as None.
+ bucket_name : str
+ Name of the S3 bucket. Only needed when bucket_key is not provided as a full s3:// url.
+ When specified, all the keys passed to bucket_key refer to this bucket. (Default: None)
+ wildcard_match : bool
+ Whether the bucket_key should be interpreted as a Unix wildcard pattern. (Default: False)
+ aws_conn_id : str
+ A reference to the S3 connection on Airflow. (Default: None)
+ verify : bool
+ Whether or not to verify SSL certificates for the S3 connection. (Default: None)
+ """
+ ...
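
A sketch of `@airflow_s3_key_sensor`, relying on the documented defaults for the arguments not shown; the sensor name, bucket path, and flow are illustrative, and the top-level import is assumed from this stub's exports:

```python
from metaflow import FlowSpec, step, airflow_s3_key_sensor

# Gate `start` on an illustrative input file; this applies only when the
# flow is compiled with `airflow create`, per the docstring above.
@airflow_s3_key_sensor(
    name='wait_for_daily_input',                  # illustrative sensor name
    description='Block until the daily input file lands',
    bucket_key='s3://my-bucket/input/daily.csv',  # full s3:// url, so bucket_name stays None
)
class S3TriggeredFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    S3TriggeredFlow()
```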
+
+ @typing.overload
+ def schedule(*, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ Specifies the times when the flow should be run when running on a
+ production scheduler.
+
+
+ Parameters
+ ----------
+ hourly : bool, default False
+ Run the workflow hourly.
+ daily : bool, default True
+ Run the workflow daily.
+ weekly : bool, default False
+ Run the workflow weekly.
+ cron : str, optional, default None
+ Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
+ specified by this expression.
+ timezone : str, optional, default None
+ Timezone in which the schedule runs (default: None). Currently supported only for Argo workflows,
+ which accept timezones in [IANA format](https://nodatime.org/TimeZones).
+ """
+ ...
+
+ @typing.overload
+ def schedule(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ ...
+
+ def schedule(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None):
+ """
+ Specifies the times when the flow should be run when running on a
+ production scheduler.
+
+
+ Parameters
+ ----------
+ hourly : bool, default False
+ Run the workflow hourly.
+ daily : bool, default True
+ Run the workflow daily.
+ weekly : bool, default False
+ Run the workflow weekly.
+ cron : str, optional, default None
+ Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
+ specified by this expression.
+ timezone : str, optional, default None
+ Timezone in which the schedule runs (default: None). Currently supported only for Argo workflows,
+ which accept timezones in [IANA format](https://nodatime.org/TimeZones).
+ """
+ ...
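
A minimal sketch of `@schedule` with a cron expression; the schedule and timezone are illustrative, and per the docstring the `timezone` argument currently applies only to Argo workflows:

```python
from metaflow import FlowSpec, schedule, step

# Run every day at 08:00 in the given IANA timezone once deployed to a
# production scheduler; the decorator has no effect on local runs.
@schedule(cron='0 8 * * *', timezone='Europe/Berlin')
class MorningReportFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    MorningReportFlow()
```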
+
+ @typing.overload
+ def conda_base(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ Specifies the Conda environment for all steps of the flow.
+
+ Use `@conda_base` to set common libraries required by all
+ steps and use `@conda` to specify step-specific additions.

- ```python
- run = Run("CheckpointsTestsFlow/8992")
- with artifact_store_from(run=run, config={
- "client_params": {
- "aws_access_key_id": os.environ.get("MY_CUSTOM_ACCESS_KEY"),
- "aws_secret_access_key": os.environ.get("MY_CUSTOM_SECRET_KEY"),
- },
- }):
- with Checkpoint() as cp:
- latest = cp.list(
- task=run["start"].task
- )[0]
- print(latest)
- cp.load(
- latest,
- "test-checkpoints"
- )

- task = Task("TorchTuneFlow/8484/train/53673")
- with artifact_store_from(run=run, config={
- "client_params": {
- "aws_access_key_id": os.environ.get("MY_CUSTOM_ACCESS_KEY"),
- "aws_secret_access_key": os.environ.get("MY_CUSTOM_SECRET_KEY"),
- },
- }):
- load_model(
- task.data.model_ref,
- "test-models"
- )
- ```
- Parameters:
+ Parameters
  ----------
+ packages : Dict[str, str], default {}
+ Packages to use for this flow. The key is the name of the package
+ and the value is the version to use.
+ libraries : Dict[str, str], default {}
+ Supported for backward compatibility. When used with packages, packages will take precedence.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ disabled : bool, default False
+ If set to True, disables Conda.
+ """
+ ...
+
+ @typing.overload
+ def conda_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ ...
+
+ def conda_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
+ """
+ Specifies the Conda environment for all steps of the flow.

- type: str
- The type of the datastore. Can be one of 's3', 'gcs', 'azure' or any other supported metaflow Datastore.
+ Use `@conda_base` to set common libraries required by all
+ steps and use `@conda` to specify step-specific additions.

- config: dict or Callable
- Dictionary of configuration options for the datastore. The following keys are required:
- - root: The root path in the datastore where the data will be saved. (needs to be in the format expected by the datastore)
- - example: 's3://bucket-name/path/to/root'
- - example: 'gs://bucket-name/path/to/root'
- - example: 'https://myblockacc.blob.core.windows.net/metaflow/'
- - role_arn (optional): AWS IAM role to access s3 bucket (only when `type` is 's3')
- - session_vars (optional): AWS session variables to access s3 bucket (only when `type` is 's3')
- - client_params (optional): AWS client parameters to access s3 bucket (only when `type` is 's3')
+
+ Parameters
+ ----------
+ packages : Dict[str, str], default {}
+ Packages to use for this flow. The key is the name of the package
+ and the value is the version to use.
+ libraries : Dict[str, str], default {}
+ Supported for backward compatibility. When used with packages, packages will take precedence.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ disabled : bool, default False
+ If set to True, disables Conda.
  """
  ...
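
A minimal sketch of `@conda_base` pinning a flow-wide interpreter and one shared package; the versions are illustrative:

```python
from metaflow import FlowSpec, conda_base, step

# Pin a flow-wide Conda environment; individual steps could still add
# step-specific packages with @conda on top of this base.
@conda_base(python='3.10.4', packages={'numpy': '1.26.4'})
class CondaFlow(FlowSpec):

    @step
    def start(self):
        import numpy as np  # resolved from the flow-level Conda environment
        print(np.__version__)
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    CondaFlow()
```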

  @typing.overload
- def trigger_on_finish(*, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def trigger(*, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
  """
- Specifies the flow(s) that this flow depends on.
+ Specifies the event(s) that this flow depends on.

  ```
- @trigger_on_finish(flow='FooFlow')
+ @trigger(event='foo')
  ```
  or
  ```
- @trigger_on_finish(flows=['FooFlow', 'BarFlow'])
+ @trigger(events=['foo', 'bar'])
  ```
- This decorator respects the @project decorator and triggers the flow
- when upstream runs within the same namespace complete successfully

- Additionally, you can specify project aware upstream flow dependencies
- by specifying the fully qualified project_flow_name.
+ Additionally, you can specify the parameter mappings
+ to map event payload to Metaflow parameters for the flow.
  ```
- @trigger_on_finish(flow='my_project.branch.my_branch.FooFlow')
+ @trigger(event={'name':'foo', 'parameters':{'flow_param': 'event_field'}})
  ```
  or
  ```
- @trigger_on_finish(flows=['my_project.branch.my_branch.FooFlow', 'BarFlow'])
+ @trigger(events=[{'name':'foo', 'parameters':{'flow_param_1': 'event_field_1'}},
+ {'name':'bar', 'parameters':{'flow_param_2': 'event_field_2'}}])
  ```

- You can also specify just the project or project branch (other values will be
- inferred from the current project or project branch):
+ 'parameters' can also be a list of strings and tuples like so:
  ```
- @trigger_on_finish(flow={"name": "FooFlow", "project": "my_project", "project_branch": "branch"})
+ @trigger(event={'name':'foo', 'parameters':['common_name', ('flow_param', 'event_field')]})
+ ```
+ This is equivalent to:
+ ```
+ @trigger(event={'name':'foo', 'parameters':{'common_name': 'common_name', 'flow_param': 'event_field'}})
  ```
-
- Note that `branch` is typically one of:
- - `prod`
- - `user.bob`
- - `test.my_experiment`
- - `prod.staging`


  Parameters
  ----------
- flow : Union[str, Dict[str, str]], optional, default None
- Upstream flow dependency for this flow.
- flows : List[Union[str, Dict[str, str]]], default []
- Upstream flow dependencies for this flow.
+ event : Union[str, Dict[str, Any]], optional, default None
+ Event dependency for this flow.
+ events : List[Union[str, Dict[str, Any]]], default []
+ Event dependencies for this flow.
  options : Dict[str, Any], default {}
  Backend-specific configuration for tuning eventing behavior.
  """
  ...

  @typing.overload
- def trigger_on_finish(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ def trigger(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
  ...

- def trigger_on_finish(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}):
+ def trigger(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}):
  """
- Specifies the flow(s) that this flow depends on.
+ Specifies the event(s) that this flow depends on.

  ```
- @trigger_on_finish(flow='FooFlow')
+ @trigger(event='foo')
  ```
  or
  ```
- @trigger_on_finish(flows=['FooFlow', 'BarFlow'])
+ @trigger(events=['foo', 'bar'])
  ```
- This decorator respects the @project decorator and triggers the flow
- when upstream runs within the same namespace complete successfully

- Additionally, you can specify project aware upstream flow dependencies
- by specifying the fully qualified project_flow_name.
+ Additionally, you can specify the parameter mappings
+ to map event payload to Metaflow parameters for the flow.
  ```
- @trigger_on_finish(flow='my_project.branch.my_branch.FooFlow')
+ @trigger(event={'name':'foo', 'parameters':{'flow_param': 'event_field'}})
  ```
  or
  ```
- @trigger_on_finish(flows=['my_project.branch.my_branch.FooFlow', 'BarFlow'])
+ @trigger(events=[{'name':'foo', 'parameters':{'flow_param_1': 'event_field_1'}},
+ {'name':'bar', 'parameters':{'flow_param_2': 'event_field_2'}}])
  ```

- You can also specify just the project or project branch (other values will be
- inferred from the current project or project branch):
+ 'parameters' can also be a list of strings and tuples like so:
  ```
- @trigger_on_finish(flow={"name": "FooFlow", "project": "my_project", "project_branch": "branch"})
+ @trigger(event={'name':'foo', 'parameters':['common_name', ('flow_param', 'event_field')]})
+ ```
+ This is equivalent to:
+ ```
+ @trigger(event={'name':'foo', 'parameters':{'common_name': 'common_name', 'flow_param': 'event_field'}})
  ```
-
- Note that `branch` is typically one of:
- - `prod`
- - `user.bob`
- - `test.my_experiment`
- - `prod.staging`


  Parameters
  ----------
- flow : Union[str, Dict[str, str]], optional, default None
- Upstream flow dependency for this flow.
- flows : List[Union[str, Dict[str, str]]], default []
- Upstream flow dependencies for this flow.
+ event : Union[str, Dict[str, Any]], optional, default None
+ Event dependency for this flow.
+ events : List[Union[str, Dict[str, Any]]], default []
+ Event dependencies for this flow.
  options : Dict[str, Any], default {}
  Backend-specific configuration for tuning eventing behavior.
  """
  ...
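
A minimal sketch of `@trigger` with a parameter mapping, per the docstring above; the event name, payload field, and flow are illustrative:

```python
from metaflow import FlowSpec, Parameter, step, trigger

# Start this flow when event 'foo' arrives, mapping the payload field
# 'event_field' onto the flow parameter 'flow_param'.
@trigger(event={'name': 'foo', 'parameters': {'flow_param': 'event_field'}})
class EventDrivenFlow(FlowSpec):
    flow_param = Parameter('flow_param', default='')

    @step
    def start(self):
        print('triggered with', self.flow_param)
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    EventDrivenFlow()
```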

+ @typing.overload
+ def pypi_base(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ Specifies the PyPI packages for all steps of the flow.
+
+ Use `@pypi_base` to set common packages required by all
+ steps and use `@pypi` to specify step-specific overrides.
+
+ Parameters
+ ----------
+ packages : Dict[str, str], default {}
+ Packages to use for this flow. The key is the name of the package
+ and the value is the version to use.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ """
+ ...
+
+ @typing.overload
+ def pypi_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ ...
+
+ def pypi_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
+ """
+ Specifies the PyPI packages for all steps of the flow.
+
+ Use `@pypi_base` to set common packages required by all
+ steps and use `@pypi` to specify step-specific overrides.
+
+ Parameters
+ ----------
+ packages : Dict[str, str], default {}
+ Packages to use for this flow. The key is the name of the package
+ and the value is the version to use.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ """
+ ...
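
A minimal sketch of `@pypi_base`, analogous to the `@conda_base` example earlier; the versions are illustrative:

```python
from metaflow import FlowSpec, pypi_base, step

# Pin a flow-wide interpreter and one common PyPI package; steps could
# override or extend this with @pypi.
@pypi_base(python='3.10.4', packages={'requests': '2.31.0'})
class PypiFlow(FlowSpec):

    @step
    def start(self):
        import requests  # resolved from the pinned PyPI environment
        print(requests.__version__)
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    PypiFlow()
```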
+
+ def airflow_external_task_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, external_dag_id: str, external_task_ids: typing.List[str], allowed_states: typing.List[str], failed_states: typing.List[str], execution_delta: "datetime.timedelta", check_existence: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ The `@airflow_external_task_sensor` decorator attaches an Airflow [ExternalTaskSensor](https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/sensors/external_task/index.html#airflow.sensors.external_task.ExternalTaskSensor) before the start step of the flow.
+ This decorator only works when a flow is scheduled on Airflow and is compiled using `airflow create`. More than one `@airflow_external_task_sensor` can be added as flow decorators. Adding more than one decorator will ensure that the `start` step starts only after all sensors finish.
+
+
+ Parameters
+ ----------
+ timeout : int
+ Time, in seconds, before the task times out and fails. (Default: 3600)
+ poke_interval : int
+ Time in seconds that the job should wait in between each try. (Default: 60)
+ mode : str
+ How the sensor operates. Options are: { poke | reschedule }. (Default: "poke")
+ exponential_backoff : bool
+ Allow progressively longer waits between pokes by using an exponential backoff algorithm. (Default: True)
+ pool : str
+ The slot pool this task should run in;
+ slot pools are a way to limit concurrency for certain tasks. (Default: None)
+ soft_fail : bool
+ Set to True to mark the task as SKIPPED on failure. (Default: False)
+ name : str
+ Name of the sensor on Airflow.
+ description : str
+ Description of the sensor in the Airflow UI.
+ external_dag_id : str
+ The dag_id that contains the task you want to wait for.
+ external_task_ids : List[str]
+ The list of task_ids that you want to wait for.
+ If None (default value), the sensor waits for the DAG. (Default: None)
+ allowed_states : List[str]
+ Iterable of allowed states. (Default: ['success'])
+ failed_states : List[str]
+ Iterable of failed or disallowed states. (Default: None)
+ execution_delta : datetime.timedelta
+ Time difference with the previous execution to look at;
+ the default is the same logical date as the current task or DAG. (Default: None)
+ check_existence : bool
+ Set to True to check if the external task exists or check if
+ the DAG to wait for exists. (Default: True)
+ """
+ ...
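
A sketch of `@airflow_external_task_sensor`, again relying on the documented defaults for the arguments not shown; the sensor name, DAG id, and flow are illustrative, and the top-level import is assumed from this stub's exports:

```python
from metaflow import FlowSpec, step, airflow_external_task_sensor

# Gate `start` on an upstream Airflow DAG; applies only when the flow is
# compiled with `airflow create`, per the docstring above.
@airflow_external_task_sensor(
    name='wait_for_nightly_etl',    # illustrative sensor name
    description='Wait for the nightly ETL DAG before starting',
    external_dag_id='nightly_etl',  # illustrative upstream DAG id
)
class DownstreamFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    DownstreamFlow()
```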
+
  pkg_name: str