ob-metaflow-stubs 6.0.6.2__py2.py3-none-any.whl → 6.0.7.1__py2.py3-none-any.whl

This diff compares two publicly released versions of the package as they appear in their public registry. It is provided for informational purposes only.

This version of ob-metaflow-stubs has been flagged as a potentially problematic release.

Files changed (261)
  1. metaflow-stubs/__init__.pyi +870 -870
  2. metaflow-stubs/cards.pyi +1 -1
  3. metaflow-stubs/cli.pyi +1 -1
  4. metaflow-stubs/cli_components/__init__.pyi +1 -1
  5. metaflow-stubs/cli_components/utils.pyi +1 -1
  6. metaflow-stubs/client/__init__.pyi +1 -1
  7. metaflow-stubs/client/core.pyi +6 -6
  8. metaflow-stubs/client/filecache.pyi +2 -2
  9. metaflow-stubs/events.pyi +1 -1
  10. metaflow-stubs/exception.pyi +1 -1
  11. metaflow-stubs/flowspec.pyi +4 -4
  12. metaflow-stubs/generated_for.txt +1 -1
  13. metaflow-stubs/includefile.pyi +3 -3
  14. metaflow-stubs/meta_files.pyi +1 -1
  15. metaflow-stubs/metadata_provider/__init__.pyi +1 -1
  16. metaflow-stubs/metadata_provider/heartbeat.pyi +1 -1
  17. metaflow-stubs/metadata_provider/metadata.pyi +2 -2
  18. metaflow-stubs/metadata_provider/util.pyi +1 -1
  19. metaflow-stubs/metaflow_config.pyi +1 -1
  20. metaflow-stubs/metaflow_current.pyi +29 -29
  21. metaflow-stubs/metaflow_git.pyi +1 -1
  22. metaflow-stubs/mf_extensions/__init__.pyi +1 -1
  23. metaflow-stubs/mf_extensions/obcheckpoint/__init__.pyi +1 -1
  24. metaflow-stubs/mf_extensions/obcheckpoint/plugins/__init__.pyi +1 -1
  25. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/__init__.pyi +1 -1
  26. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/card_utils/__init__.pyi +1 -1
  27. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/card_utils/async_cards.pyi +2 -2
  28. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/card_utils/deco_injection_mixin.pyi +1 -1
  29. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/card_utils/extra_components.pyi +3 -3
  30. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/__init__.pyi +1 -1
  31. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/cards/__init__.pyi +1 -1
  32. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/cards/checkpoint_lister.pyi +4 -4
  33. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/cards/lineage_card.pyi +1 -1
  34. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/checkpoint_storage.pyi +4 -4
  35. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/constructors.pyi +1 -1
  36. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/core.pyi +2 -2
  37. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/decorator.pyi +4 -4
  38. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/exceptions.pyi +1 -1
  39. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/final_api.pyi +2 -2
  40. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/checkpoints/lineage.pyi +1 -1
  41. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/__init__.pyi +1 -1
  42. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/context.pyi +2 -2
  43. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/core.pyi +1 -1
  44. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/decorator.pyi +1 -1
  45. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/exceptions.pyi +1 -1
  46. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/task_utils.pyi +3 -3
  47. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastore/utils.pyi +1 -1
  48. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/datastructures.pyi +2 -2
  49. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/exceptions.pyi +1 -1
  50. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/hf_hub/__init__.pyi +1 -1
  51. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/hf_hub/decorator.pyi +1 -1
  52. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/modeling_utils/__init__.pyi +1 -1
  53. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/modeling_utils/core.pyi +2 -2
  54. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/modeling_utils/exceptions.pyi +1 -1
  55. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/modeling_utils/model_storage.pyi +3 -3
  56. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/__init__.pyi +1 -1
  57. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/flowspec_utils.pyi +1 -1
  58. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/general.pyi +1 -1
  59. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/identity_utils.pyi +2 -2
  60. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/serialization_handler/__init__.pyi +1 -1
  61. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/serialization_handler/base.pyi +1 -1
  62. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/serialization_handler/tar.pyi +2 -2
  63. metaflow-stubs/mf_extensions/obcheckpoint/plugins/machine_learning_utilities/utils/tar_utils.pyi +2 -2
  64. metaflow-stubs/mf_extensions/outerbounds/__init__.pyi +1 -1
  65. metaflow-stubs/mf_extensions/outerbounds/plugins/__init__.pyi +1 -1
  66. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/__init__.pyi +1 -1
  67. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/__init__.pyi +1 -1
  68. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/_state_machine.pyi +1 -1
  69. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/_vendor/__init__.pyi +1 -1
  70. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/_vendor/spinner/__init__.pyi +1 -1
  71. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/_vendor/spinner/spinners.pyi +1 -1
  72. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/app_cli.pyi +2 -2
  73. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/app_config.pyi +2 -2
  74. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/capsule.pyi +1 -1
  75. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/click_importer.pyi +1 -1
  76. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/code_package/__init__.pyi +1 -1
  77. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/code_package/code_packager.pyi +1 -1
  78. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/__init__.pyi +1 -1
  79. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/cli_generator.pyi +1 -1
  80. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/config_utils.pyi +4 -4
  81. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/schema_export.pyi +1 -1
  82. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/typed_configs.pyi +2 -2
  83. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/config/unified_config.pyi +2 -2
  84. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/dependencies.pyi +2 -2
  85. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/deployer.pyi +5 -5
  86. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/experimental/__init__.pyi +1 -1
  87. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/perimeters.pyi +1 -1
  88. metaflow-stubs/mf_extensions/outerbounds/plugins/apps/core/utils.pyi +3 -3
  89. metaflow-stubs/mf_extensions/outerbounds/plugins/aws/__init__.pyi +1 -1
  90. metaflow-stubs/mf_extensions/outerbounds/plugins/aws/assume_role_decorator.pyi +2 -2
  91. metaflow-stubs/mf_extensions/outerbounds/plugins/card_utilities/__init__.pyi +1 -1
  92. metaflow-stubs/mf_extensions/outerbounds/plugins/card_utilities/async_cards.pyi +1 -1
  93. metaflow-stubs/mf_extensions/outerbounds/plugins/card_utilities/injector.pyi +1 -1
  94. metaflow-stubs/mf_extensions/outerbounds/plugins/checkpoint_datastores/__init__.pyi +1 -1
  95. metaflow-stubs/mf_extensions/outerbounds/plugins/checkpoint_datastores/coreweave.pyi +2 -2
  96. metaflow-stubs/mf_extensions/outerbounds/plugins/checkpoint_datastores/nebius.pyi +2 -2
  97. metaflow-stubs/mf_extensions/outerbounds/plugins/fast_bakery/__init__.pyi +1 -1
  98. metaflow-stubs/mf_extensions/outerbounds/plugins/fast_bakery/baker.pyi +3 -3
  99. metaflow-stubs/mf_extensions/outerbounds/plugins/fast_bakery/docker_environment.pyi +2 -2
  100. metaflow-stubs/mf_extensions/outerbounds/plugins/fast_bakery/fast_bakery.pyi +1 -1
  101. metaflow-stubs/mf_extensions/outerbounds/plugins/kubernetes/__init__.pyi +1 -1
  102. metaflow-stubs/mf_extensions/outerbounds/plugins/kubernetes/pod_killer.pyi +1 -1
  103. metaflow-stubs/mf_extensions/outerbounds/plugins/ollama/__init__.pyi +1 -1
  104. metaflow-stubs/mf_extensions/outerbounds/plugins/ollama/constants.pyi +1 -1
  105. metaflow-stubs/mf_extensions/outerbounds/plugins/ollama/exceptions.pyi +1 -1
  106. metaflow-stubs/mf_extensions/outerbounds/plugins/ollama/ollama.pyi +1 -1
  107. metaflow-stubs/mf_extensions/outerbounds/plugins/ollama/status_card.pyi +1 -1
  108. metaflow-stubs/mf_extensions/outerbounds/plugins/snowflake/__init__.pyi +1 -1
  109. metaflow-stubs/mf_extensions/outerbounds/plugins/snowflake/snowflake.pyi +1 -1
  110. metaflow-stubs/mf_extensions/outerbounds/profilers/__init__.pyi +1 -1
  111. metaflow-stubs/mf_extensions/outerbounds/profilers/gpu.pyi +1 -1
  112. metaflow-stubs/mf_extensions/outerbounds/remote_config.pyi +3 -3
  113. metaflow-stubs/mf_extensions/outerbounds/toplevel/__init__.pyi +1 -1
  114. metaflow-stubs/mf_extensions/outerbounds/toplevel/global_aliases_for_metaflow_package.pyi +1 -1
  115. metaflow-stubs/mf_extensions/outerbounds/toplevel/s3_proxy.pyi +1 -1
  116. metaflow-stubs/multicore_utils.pyi +1 -1
  117. metaflow-stubs/ob_internal.pyi +1 -1
  118. metaflow-stubs/packaging_sys/__init__.pyi +5 -5
  119. metaflow-stubs/packaging_sys/backend.pyi +3 -3
  120. metaflow-stubs/packaging_sys/distribution_support.pyi +3 -3
  121. metaflow-stubs/packaging_sys/tar_backend.pyi +6 -6
  122. metaflow-stubs/packaging_sys/utils.pyi +1 -1
  123. metaflow-stubs/packaging_sys/v1.pyi +3 -3
  124. metaflow-stubs/parameters.pyi +3 -3
  125. metaflow-stubs/plugins/__init__.pyi +11 -11
  126. metaflow-stubs/plugins/airflow/__init__.pyi +1 -1
  127. metaflow-stubs/plugins/airflow/airflow_utils.pyi +1 -1
  128. metaflow-stubs/plugins/airflow/exception.pyi +1 -1
  129. metaflow-stubs/plugins/airflow/sensors/__init__.pyi +1 -1
  130. metaflow-stubs/plugins/airflow/sensors/base_sensor.pyi +1 -1
  131. metaflow-stubs/plugins/airflow/sensors/external_task_sensor.pyi +1 -1
  132. metaflow-stubs/plugins/airflow/sensors/s3_sensor.pyi +1 -1
  133. metaflow-stubs/plugins/argo/__init__.pyi +1 -1
  134. metaflow-stubs/plugins/argo/argo_client.pyi +1 -1
  135. metaflow-stubs/plugins/argo/argo_events.pyi +1 -1
  136. metaflow-stubs/plugins/argo/argo_workflows.pyi +2 -2
  137. metaflow-stubs/plugins/argo/argo_workflows_decorator.pyi +2 -2
  138. metaflow-stubs/plugins/argo/argo_workflows_deployer.pyi +2 -2
  139. metaflow-stubs/plugins/argo/argo_workflows_deployer_objects.pyi +3 -3
  140. metaflow-stubs/plugins/argo/exit_hooks.pyi +2 -2
  141. metaflow-stubs/plugins/aws/__init__.pyi +1 -1
  142. metaflow-stubs/plugins/aws/aws_client.pyi +1 -1
  143. metaflow-stubs/plugins/aws/aws_utils.pyi +1 -1
  144. metaflow-stubs/plugins/aws/batch/__init__.pyi +1 -1
  145. metaflow-stubs/plugins/aws/batch/batch.pyi +1 -1
  146. metaflow-stubs/plugins/aws/batch/batch_client.pyi +1 -1
  147. metaflow-stubs/plugins/aws/batch/batch_decorator.pyi +1 -1
  148. metaflow-stubs/plugins/aws/secrets_manager/__init__.pyi +1 -1
  149. metaflow-stubs/plugins/aws/secrets_manager/aws_secrets_manager_secrets_provider.pyi +3 -3
  150. metaflow-stubs/plugins/aws/step_functions/__init__.pyi +1 -1
  151. metaflow-stubs/plugins/aws/step_functions/event_bridge_client.pyi +1 -1
  152. metaflow-stubs/plugins/aws/step_functions/schedule_decorator.pyi +1 -1
  153. metaflow-stubs/plugins/aws/step_functions/step_functions.pyi +1 -1
  154. metaflow-stubs/plugins/aws/step_functions/step_functions_client.pyi +1 -1
  155. metaflow-stubs/plugins/aws/step_functions/step_functions_deployer.pyi +2 -2
  156. metaflow-stubs/plugins/aws/step_functions/step_functions_deployer_objects.pyi +3 -3
  157. metaflow-stubs/plugins/azure/__init__.pyi +1 -1
  158. metaflow-stubs/plugins/azure/azure_credential.pyi +1 -1
  159. metaflow-stubs/plugins/azure/azure_exceptions.pyi +1 -1
  160. metaflow-stubs/plugins/azure/azure_secret_manager_secrets_provider.pyi +3 -3
  161. metaflow-stubs/plugins/azure/azure_utils.pyi +1 -1
  162. metaflow-stubs/plugins/azure/blob_service_client_factory.pyi +1 -1
  163. metaflow-stubs/plugins/azure/includefile_support.pyi +1 -1
  164. metaflow-stubs/plugins/cards/__init__.pyi +1 -1
  165. metaflow-stubs/plugins/cards/card_client.pyi +2 -2
  166. metaflow-stubs/plugins/cards/card_creator.pyi +1 -1
  167. metaflow-stubs/plugins/cards/card_datastore.pyi +1 -1
  168. metaflow-stubs/plugins/cards/card_decorator.pyi +2 -2
  169. metaflow-stubs/plugins/cards/card_modules/__init__.pyi +1 -1
  170. metaflow-stubs/plugins/cards/card_modules/basic.pyi +1 -1
  171. metaflow-stubs/plugins/cards/card_modules/card.pyi +1 -1
  172. metaflow-stubs/plugins/cards/card_modules/components.pyi +2 -2
  173. metaflow-stubs/plugins/cards/card_modules/convert_to_native_type.pyi +1 -1
  174. metaflow-stubs/plugins/cards/card_modules/renderer_tools.pyi +1 -1
  175. metaflow-stubs/plugins/cards/card_modules/test_cards.pyi +1 -1
  176. metaflow-stubs/plugins/cards/card_resolver.pyi +1 -1
  177. metaflow-stubs/plugins/cards/component_serializer.pyi +1 -1
  178. metaflow-stubs/plugins/cards/exception.pyi +1 -1
  179. metaflow-stubs/plugins/catch_decorator.pyi +1 -1
  180. metaflow-stubs/plugins/datatools/__init__.pyi +1 -1
  181. metaflow-stubs/plugins/datatools/local.pyi +1 -1
  182. metaflow-stubs/plugins/datatools/s3/__init__.pyi +1 -1
  183. metaflow-stubs/plugins/datatools/s3/s3.pyi +2 -2
  184. metaflow-stubs/plugins/datatools/s3/s3tail.pyi +1 -1
  185. metaflow-stubs/plugins/datatools/s3/s3util.pyi +1 -1
  186. metaflow-stubs/plugins/debug_logger.pyi +1 -1
  187. metaflow-stubs/plugins/debug_monitor.pyi +1 -1
  188. metaflow-stubs/plugins/environment_decorator.pyi +1 -1
  189. metaflow-stubs/plugins/events_decorator.pyi +1 -1
  190. metaflow-stubs/plugins/exit_hook/__init__.pyi +1 -1
  191. metaflow-stubs/plugins/exit_hook/exit_hook_decorator.pyi +1 -1
  192. metaflow-stubs/plugins/frameworks/__init__.pyi +1 -1
  193. metaflow-stubs/plugins/frameworks/pytorch.pyi +1 -1
  194. metaflow-stubs/plugins/gcp/__init__.pyi +1 -1
  195. metaflow-stubs/plugins/gcp/gcp_secret_manager_secrets_provider.pyi +3 -3
  196. metaflow-stubs/plugins/gcp/gs_exceptions.pyi +1 -1
  197. metaflow-stubs/plugins/gcp/gs_storage_client_factory.pyi +1 -1
  198. metaflow-stubs/plugins/gcp/gs_utils.pyi +1 -1
  199. metaflow-stubs/plugins/gcp/includefile_support.pyi +1 -1
  200. metaflow-stubs/plugins/kubernetes/__init__.pyi +1 -1
  201. metaflow-stubs/plugins/kubernetes/kube_utils.pyi +2 -2
  202. metaflow-stubs/plugins/kubernetes/kubernetes.pyi +1 -1
  203. metaflow-stubs/plugins/kubernetes/kubernetes_client.pyi +1 -1
  204. metaflow-stubs/plugins/kubernetes/kubernetes_decorator.pyi +1 -1
  205. metaflow-stubs/plugins/kubernetes/kubernetes_jobsets.pyi +1 -1
  206. metaflow-stubs/plugins/kubernetes/spot_monitor_sidecar.pyi +1 -1
  207. metaflow-stubs/plugins/ollama/__init__.pyi +2 -2
  208. metaflow-stubs/plugins/parallel_decorator.pyi +1 -1
  209. metaflow-stubs/plugins/perimeters.pyi +1 -1
  210. metaflow-stubs/plugins/project_decorator.pyi +1 -1
  211. metaflow-stubs/plugins/pypi/__init__.pyi +1 -1
  212. metaflow-stubs/plugins/pypi/conda_decorator.pyi +1 -1
  213. metaflow-stubs/plugins/pypi/conda_environment.pyi +4 -4
  214. metaflow-stubs/plugins/pypi/parsers.pyi +1 -1
  215. metaflow-stubs/plugins/pypi/pypi_decorator.pyi +1 -1
  216. metaflow-stubs/plugins/pypi/pypi_environment.pyi +1 -1
  217. metaflow-stubs/plugins/pypi/utils.pyi +1 -1
  218. metaflow-stubs/plugins/resources_decorator.pyi +1 -1
  219. metaflow-stubs/plugins/retry_decorator.pyi +1 -1
  220. metaflow-stubs/plugins/secrets/__init__.pyi +2 -2
  221. metaflow-stubs/plugins/secrets/inline_secrets_provider.pyi +3 -3
  222. metaflow-stubs/plugins/secrets/secrets_decorator.pyi +1 -1
  223. metaflow-stubs/plugins/secrets/secrets_func.pyi +1 -1
  224. metaflow-stubs/plugins/secrets/secrets_spec.pyi +1 -1
  225. metaflow-stubs/plugins/secrets/utils.pyi +1 -1
  226. metaflow-stubs/plugins/snowflake/__init__.pyi +1 -1
  227. metaflow-stubs/plugins/storage_executor.pyi +1 -1
  228. metaflow-stubs/plugins/test_unbounded_foreach_decorator.pyi +1 -1
  229. metaflow-stubs/plugins/timeout_decorator.pyi +1 -1
  230. metaflow-stubs/plugins/torchtune/__init__.pyi +1 -1
  231. metaflow-stubs/plugins/uv/__init__.pyi +1 -1
  232. metaflow-stubs/plugins/uv/uv_environment.pyi +2 -2
  233. metaflow-stubs/profilers/__init__.pyi +1 -1
  234. metaflow-stubs/pylint_wrapper.pyi +1 -1
  235. metaflow-stubs/runner/__init__.pyi +1 -1
  236. metaflow-stubs/runner/deployer.pyi +4 -4
  237. metaflow-stubs/runner/deployer_impl.pyi +1 -1
  238. metaflow-stubs/runner/metaflow_runner.pyi +3 -3
  239. metaflow-stubs/runner/nbdeploy.pyi +1 -1
  240. metaflow-stubs/runner/nbrun.pyi +1 -1
  241. metaflow-stubs/runner/subprocess_manager.pyi +1 -1
  242. metaflow-stubs/runner/utils.pyi +2 -2
  243. metaflow-stubs/system/__init__.pyi +1 -1
  244. metaflow-stubs/system/system_logger.pyi +2 -2
  245. metaflow-stubs/system/system_monitor.pyi +1 -1
  246. metaflow-stubs/tagging_util.pyi +1 -1
  247. metaflow-stubs/tuple_util.pyi +1 -1
  248. metaflow-stubs/user_configs/__init__.pyi +1 -1
  249. metaflow-stubs/user_configs/config_options.pyi +2 -2
  250. metaflow-stubs/user_configs/config_parameters.pyi +6 -6
  251. metaflow-stubs/user_decorators/__init__.pyi +1 -1
  252. metaflow-stubs/user_decorators/common.pyi +1 -1
  253. metaflow-stubs/user_decorators/mutable_flow.pyi +5 -5
  254. metaflow-stubs/user_decorators/mutable_step.pyi +5 -5
  255. metaflow-stubs/user_decorators/user_flow_decorator.pyi +5 -5
  256. metaflow-stubs/user_decorators/user_step_decorator.pyi +5 -5
  257. {ob_metaflow_stubs-6.0.6.2.dist-info → ob_metaflow_stubs-6.0.7.1.dist-info}/METADATA +1 -1
  258. ob_metaflow_stubs-6.0.7.1.dist-info/RECORD +261 -0
  259. ob_metaflow_stubs-6.0.6.2.dist-info/RECORD +0 -261
  260. {ob_metaflow_stubs-6.0.6.2.dist-info → ob_metaflow_stubs-6.0.7.1.dist-info}/WHEEL +0 -0
  261. {ob_metaflow_stubs-6.0.6.2.dist-info → ob_metaflow_stubs-6.0.7.1.dist-info}/top_level.txt +0 -0
@@ -1,7 +1,7 @@
 ######################################################################################################
 # Auto-generated Metaflow stub file #
 # MF version: 2.17.1.0+obcheckpoint(0.2.4);ob(v1) #
-# Generated on 2025-08-11T21:26:28.605682 #
+# Generated on 2025-08-19T19:04:22.043902 #
 ######################################################################################################
 
 from __future__ import annotations
@@ -39,18 +39,18 @@ from .user_decorators.user_step_decorator import UserStepDecorator as UserStepDe
 from .user_decorators.user_step_decorator import StepMutator as StepMutator
 from .user_decorators.user_step_decorator import user_step_decorator as user_step_decorator
 from .user_decorators.user_flow_decorator import FlowMutator as FlowMutator
+from . import cards as cards
 from . import metaflow_git as metaflow_git
 from . import tuple_util as tuple_util
-from . import cards as cards
 from . import events as events
 from . import runner as runner
 from . import plugins as plugins
 from .mf_extensions.outerbounds.toplevel.global_aliases_for_metaflow_package import S3 as S3
 from . import includefile as includefile
 from .includefile import IncludeFile as IncludeFile
-from .plugins.pypi.parsers import conda_environment_yml_parser as conda_environment_yml_parser
-from .plugins.pypi.parsers import requirements_txt_parser as requirements_txt_parser
 from .plugins.pypi.parsers import pyproject_toml_parser as pyproject_toml_parser
+from .plugins.pypi.parsers import requirements_txt_parser as requirements_txt_parser
+from .plugins.pypi.parsers import conda_environment_yml_parser as conda_environment_yml_parser
 from . import client as client
 from .client.core import namespace as namespace
 from .client.core import get_namespace as get_namespace
@@ -167,78 +167,75 @@ def step(f: typing.Union[typing.Callable[[FlowSpecDerived], None], typing.Callab
     """
     ...
 
-@typing.overload
-def parallel(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
-    """
-    Decorator prototype for all step decorators. This function gets specialized
-    and imported for all decorators types by _import_plugin_decorators().
-    """
-    ...
-
-@typing.overload
-def parallel(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
-    ...
-
-def parallel(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
+def nvidia(*, gpu: int, gpu_type: str, queue_timeout: int) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
     """
-    Decorator prototype for all step decorators. This function gets specialized
-    and imported for all decorators types by _import_plugin_decorators().
+    Specifies that this step should execute on DGX cloud.
+
+
+    Parameters
+    ----------
+    gpu : int
+        Number of GPUs to use.
+    gpu_type : str
+        Type of Nvidia GPU to use.
+    queue_timeout : int
+        Time to keep the job in NVCF's queue.
     """
     ...
 
 @typing.overload
-def pypi(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+def catch(*, var: typing.Optional[str] = None, print_exception: bool = True) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
     """
-    Specifies the PyPI packages for the step.
+    Specifies that the step will success under all circumstances.
 
-    Information in this decorator will augment any
-    attributes set in the `@pyi_base` flow-level decorator. Hence,
-    you can use `@pypi_base` to set packages required by all
-    steps and use `@pypi` to specify step-specific overrides.
+    The decorator will create an optional artifact, specified by `var`, which
+    contains the exception raised. You can use it to detect the presence
+    of errors, indicating that all happy-path artifacts produced by the step
+    are missing.
 
 
     Parameters
     ----------
-    packages : Dict[str, str], default: {}
-        Packages to use for this step. The key is the name of the package
-        and the value is the version to use.
-    python : str, optional, default: None
-        Version of Python to use, e.g. '3.7.4'. A default value of None implies
-        that the version used will correspond to the version of the Python interpreter used to start the run.
+    var : str, optional, default None
+        Name of the artifact in which to store the caught exception.
+        If not specified, the exception is not stored.
+    print_exception : bool, default True
+        Determines whether or not the exception is printed to
+        stdout when caught.
     """
     ...
 
 @typing.overload
-def pypi(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+def catch(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
     ...
 
 @typing.overload
-def pypi(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+def catch(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
     ...
 
-def pypi(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
+def catch(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, var: typing.Optional[str] = None, print_exception: bool = True):
     """
-    Specifies the PyPI packages for the step.
+    Specifies that the step will success under all circumstances.
 
-    Information in this decorator will augment any
-    attributes set in the `@pyi_base` flow-level decorator. Hence,
-    you can use `@pypi_base` to set packages required by all
-    steps and use `@pypi` to specify step-specific overrides.
+    The decorator will create an optional artifact, specified by `var`, which
+    contains the exception raised. You can use it to detect the presence
+    of errors, indicating that all happy-path artifacts produced by the step
+    are missing.
 
 
     Parameters
     ----------
-    packages : Dict[str, str], default: {}
-        Packages to use for this step. The key is the name of the package
-        and the value is the version to use.
-    python : str, optional, default: None
-        Version of Python to use, e.g. '3.7.4'. A default value of None implies
-        that the version used will correspond to the version of the Python interpreter used to start the run.
+    var : str, optional, default None
+        Name of the artifact in which to store the caught exception.
+        If not specified, the exception is not stored.
+    print_exception : bool, default True
+        Determines whether or not the exception is printed to
+        stdout when caught.
     """
     ...
 
 @typing.overload
-def app_deploy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+def parallel(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
     """
     Decorator prototype for all step decorators. This function gets specialized
     and imported for all decorators types by _import_plugin_decorators().
@@ -246,10 +243,10 @@ def app_deploy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.
     ...
 
 @typing.overload
-def app_deploy(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+def parallel(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
     ...
 
-def app_deploy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
+def parallel(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
     """
     Decorator prototype for all step decorators. This function gets specialized
     and imported for all decorators types by _import_plugin_decorators().
@@ -257,194 +254,116 @@ def app_deploy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None
     ...
 
 @typing.overload
-def fast_bakery_internal(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+def test_append_card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
     """
-    Internal decorator to support Fast bakery
+    A simple decorator that demonstrates using CardDecoratorInjector
+    to inject a card and render simple markdown content.
     """
     ...
 
 @typing.overload
-def fast_bakery_internal(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+def test_append_card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
     ...
 
-def fast_bakery_internal(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
+def test_append_card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
     """
-    Internal decorator to support Fast bakery
+    A simple decorator that demonstrates using CardDecoratorInjector
+    to inject a card and render simple markdown content.
     """
     ...
 
-@typing.overload
-def checkpoint(*, load_policy: str = 'fresh', temp_dir_root: str = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+def kubernetes(*, cpu: int = 1, memory: int = 4096, disk: int = 10240, image: typing.Optional[str] = None, image_pull_policy: str = 'KUBERNETES_IMAGE_PULL_POLICY', image_pull_secrets: typing.List[str] = [], service_account: str = 'METAFLOW_KUBERNETES_SERVICE_ACCOUNT', secrets: typing.Optional[typing.List[str]] = None, node_selector: typing.Union[typing.Dict[str, str], str, None] = None, namespace: str = 'METAFLOW_KUBERNETES_NAMESPACE', gpu: typing.Optional[int] = None, gpu_vendor: str = 'KUBERNETES_GPU_VENDOR', tolerations: typing.List[typing.Dict[str, str]] = [], labels: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_LABELS', annotations: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_ANNOTATIONS', use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: typing.Optional[int] = None, tmpfs_path: typing.Optional[str] = '/metaflow_temp', persistent_volume_claims: typing.Optional[typing.Dict[str, str]] = None, shared_memory: typing.Optional[int] = None, port: typing.Optional[int] = None, compute_pool: typing.Optional[str] = None, hostname_resolution_timeout: int = 600, qos: str = 'Burstable', security_context: typing.Optional[typing.Dict[str, typing.Any]] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
     """
-    Enables checkpointing for a step.
-
-    > Examples
-
-    - Saving Checkpoints
-
-    ```python
-    @checkpoint
-    @step
-    def train(self):
-        model = create_model(self.parameters, checkpoint_path = None)
-        for i in range(self.epochs):
-            # some training logic
-            loss = model.train(self.dataset)
-            if i % 10 == 0:
-                model.save(
-                    current.checkpoint.directory,
-                )
-                # saves the contents of the `current.checkpoint.directory` as a checkpoint
-                # and returns a reference dictionary to the checkpoint saved in the datastore
-                self.latest_checkpoint = current.checkpoint.save(
-                    name="epoch_checkpoint",
-                    metadata={
-                        "epoch": i,
-                        "loss": loss,
-                    }
-                )
-    ```
-
-    - Using Loaded Checkpoints
-
-    ```python
-    @retry(times=3)
-    @checkpoint
-    @step
-    def train(self):
-        # Assume that the task has restarted and the previous attempt of the task
-        # saved a checkpoint
-        checkpoint_path = None
-        if current.checkpoint.is_loaded:  # Check if a checkpoint is loaded
-            print("Loaded checkpoint from the previous attempt")
-            checkpoint_path = current.checkpoint.directory
-
-        model = create_model(self.parameters, checkpoint_path = checkpoint_path)
-        for i in range(self.epochs):
-            ...
-    ```
+    Specifies that this step should execute on Kubernetes.
 
 
     Parameters
     ----------
-    load_policy : str, default: "fresh"
-        The policy for loading the checkpoint. The following policies are supported:
-        - "eager": Loads the the latest available checkpoint within the namespace.
-          With this mode, the latest checkpoint written by any previous task (can be even a different run) of the step
-          will be loaded at the start of the task.
-        - "none": Do not load any checkpoint
-        - "fresh": Loads the lastest checkpoint created within the running Task.
-          This mode helps loading checkpoints across various retry attempts of the same task.
338
- With this mode, no checkpoint will be loaded at the start of a task but any checkpoints
339
- created within the task will be loaded when the task is retries execution on failure.
282
+ cpu : int, default 1
283
+ Number of CPUs required for this step. If `@resources` is
284
+ also present, the maximum value from all decorators is used.
285
+ memory : int, default 4096
286
+ Memory size (in MB) required for this step. If
287
+ `@resources` is also present, the maximum value from all decorators is
288
+ used.
289
+ disk : int, default 10240
290
+ Disk size (in MB) required for this step. If
291
+ `@resources` is also present, the maximum value from all decorators is
292
+ used.
293
+ image : str, optional, default None
294
+ Docker image to use when launching on Kubernetes. If not specified, and
295
+ METAFLOW_KUBERNETES_CONTAINER_IMAGE is specified, that image is used. If
296
+ not, a default Docker image mapping to the current version of Python is used.
297
+ image_pull_policy: str, default KUBERNETES_IMAGE_PULL_POLICY
298
+ If given, the imagePullPolicy to be applied to the Docker image of the step.
299
+ image_pull_secrets: List[str], default []
300
+ The default is extracted from METAFLOW_KUBERNETES_IMAGE_PULL_SECRETS.
301
+ Kubernetes image pull secrets to use when pulling container images
302
+ in Kubernetes.
303
+ service_account : str, default METAFLOW_KUBERNETES_SERVICE_ACCOUNT
304
+ Kubernetes service account to use when launching pod in Kubernetes.
305
+ secrets : List[str], optional, default None
306
+ Kubernetes secrets to use when launching pod in Kubernetes. These
307
+ secrets are in addition to the ones defined in `METAFLOW_KUBERNETES_SECRETS`
308
+ in Metaflow configuration.
309
+ node_selector: Union[Dict[str,str], str], optional, default None
310
+ Kubernetes node selector(s) to apply to the pod running the task.
311
+ Can be passed in as a comma separated string of values e.g.
312
+ 'kubernetes.io/os=linux,kubernetes.io/arch=amd64' or as a dictionary
313
+ {'kubernetes.io/os': 'linux', 'kubernetes.io/arch': 'amd64'}
314
+ namespace : str, default METAFLOW_KUBERNETES_NAMESPACE
315
+ Kubernetes namespace to use when launching pod in Kubernetes.
316
+ gpu : int, optional, default None
317
+ Number of GPUs required for this step. A value of zero implies that
318
+ the scheduled node should not have GPUs.
319
+ gpu_vendor : str, default KUBERNETES_GPU_VENDOR
320
+ The vendor of the GPUs to be used for this step.
321
+ tolerations : List[Dict[str,str]], default []
322
+ The default is extracted from METAFLOW_KUBERNETES_TOLERATIONS.
323
+ Kubernetes tolerations to use when launching pod in Kubernetes.
324
+ labels: Dict[str, str], default: METAFLOW_KUBERNETES_LABELS
325
+ Kubernetes labels to use when launching pod in Kubernetes.
326
+ annotations: Dict[str, str], default: METAFLOW_KUBERNETES_ANNOTATIONS
327
+ Kubernetes annotations to use when launching pod in Kubernetes.
328
+ use_tmpfs : bool, default False
329
+ This enables an explicit tmpfs mount for this step.
330
+ tmpfs_tempdir : bool, default True
331
+ sets METAFLOW_TEMPDIR to tmpfs_path if set for this step.
332
+ tmpfs_size : int, optional, default: None
333
+ The value for the size (in MiB) of the tmpfs mount for this step.
334
+ This parameter maps to the `--tmpfs` option in Docker. Defaults to 50% of the
335
+ memory allocated for this step.
336
+ tmpfs_path : str, optional, default /metaflow_temp
337
+ Path to tmpfs mount for this step.
338
+ persistent_volume_claims : Dict[str, str], optional, default None
339
+ A map (dictionary) of persistent volumes to be mounted to the pod for this step. The map is from persistent
340
+ volumes to the path to which the volume is to be mounted, e.g., `{'pvc-name': '/path/to/mount/on'}`.
341
+ shared_memory: int, optional
342
+ Shared memory size (in MiB) required for this step
343
+ port: int, optional
344
+ Port number to specify in the Kubernetes job object
345
+ compute_pool : str, optional, default None
346
+ Compute pool to be used for for this step.
347
+ If not specified, any accessible compute pool within the perimeter is used.
348
+ hostname_resolution_timeout: int, default 10 * 60
349
+ Timeout in seconds for the workers tasks in the gang scheduled cluster to resolve the hostname of control task.
350
+ Only applicable when @parallel is used.
351
+ qos: str, default: Burstable
352
+ Quality of Service class to assign to the pod. Supported values are: Guaranteed, Burstable, BestEffort
340
353
 
341
- temp_dir_root : str, default: None
342
- The root directory under which `current.checkpoint.directory` will be created.
354
+ security_context: Dict[str, Any], optional, default None
355
+ Container security context. Applies to the task container. Allows the following keys:
356
+ - privileged: bool, optional, default None
357
+ - allow_privilege_escalation: bool, optional, default None
358
+ - run_as_user: int, optional, default None
359
+ - run_as_group: int, optional, default None
360
+ - run_as_non_root: bool, optional, default None
343
361
  """
344
362
  ...
345
363
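The `@kubernetes` docstring above notes that for `cpu`, `memory`, and `disk`, the maximum value from all decorators is used when `@resources` is also present. A minimal sketch of that merge rule; the helper name and the dict shapes are hypothetical, not part of the stub:

```python
# Hypothetical illustration of the documented merge rule: for each resource
# key, the effective value is the maximum across @kubernetes and @resources.
def effective_resources(kubernetes_attrs, resources_attrs):
    merged = dict(kubernetes_attrs)
    for key, value in resources_attrs.items():
        merged[key] = max(merged.get(key, 0), value)
    return merged

k8s = {"cpu": 1, "memory": 4096, "disk": 10240}  # @kubernetes defaults
res = {"cpu": 4, "memory": 2048}                 # e.g. @resources(cpu=4, memory=2048)
print(effective_resources(k8s, res))
# {'cpu': 4, 'memory': 4096, 'disk': 10240}
```

Note that the merge is per-key, so a step never receives less than what either decorator requested.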
 
- @typing.overload
- def checkpoint(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def checkpoint(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def checkpoint(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, load_policy: str = 'fresh', temp_dir_root: str = None):
+ def vllm(*, model: str, backend: str, openai_api_server: bool, debug: bool, card_refresh_interval: int, max_retries: int, retry_alert_frequency: int, engine_args: dict) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Enables checkpointing for a step.
-
- > Examples
-
- - Saving Checkpoints
-
- ```python
- @checkpoint
- @step
- def train(self):
-     model = create_model(self.parameters, checkpoint_path = None)
-     for i in range(self.epochs):
-         # some training logic
-         loss = model.train(self.dataset)
-         if i % 10 == 0:
-             model.save(
-                 current.checkpoint.directory,
-             )
-             # saves the contents of the `current.checkpoint.directory` as a checkpoint
-             # and returns a reference dictionary to the checkpoint saved in the datastore
-             self.latest_checkpoint = current.checkpoint.save(
-                 name="epoch_checkpoint",
-                 metadata={
-                     "epoch": i,
-                     "loss": loss,
-                 }
-             )
- ```
-
- - Using Loaded Checkpoints
-
- ```python
- @retry(times=3)
- @checkpoint
- @step
- def train(self):
-     # Assume that the task has restarted and the previous attempt of the task
-     # saved a checkpoint
-     checkpoint_path = None
-     if current.checkpoint.is_loaded:  # Check if a checkpoint is loaded
-         print("Loaded checkpoint from the previous attempt")
-         checkpoint_path = current.checkpoint.directory
-
-     model = create_model(self.parameters, checkpoint_path = checkpoint_path)
-     for i in range(self.epochs):
-         ...
- ```
-
-
- Parameters
- ----------
- load_policy : str, default: "fresh"
- The policy for loading the checkpoint. The following policies are supported:
- - "eager": Loads the latest available checkpoint within the namespace.
- With this mode, the latest checkpoint written by any previous task (even from a different run) of the step
- will be loaded at the start of the task.
- - "none": Do not load any checkpoint.
- - "fresh": Loads the latest checkpoint created within the running Task.
- This mode helps load checkpoints across various retry attempts of the same task.
- With this mode, no checkpoint will be loaded at the start of a task, but any checkpoints
- created within the task will be loaded when the task retries execution on failure.
-
- temp_dir_root : str, default: None
- The root directory under which `current.checkpoint.directory` will be created.
- """
- ...
-
- def s3_proxy(*, integration_name: typing.Optional[str] = None, write_mode: typing.Optional[str] = None, debug: typing.Optional[bool] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- S3 Proxy decorator for routing S3 requests through a local proxy service.
-
-
- Parameters
- ----------
- integration_name : str, optional
- Name of the S3 proxy integration. If not specified, will use the only
- available S3 proxy integration in the namespace (fails if multiple exist).
- write_mode : str, optional
- The desired behavior during write operations to the target (origin) S3 bucket.
- Allowed options are:
- "origin-and-cache" -> write to both the target S3 bucket and local object
- storage
- "origin" -> only write to the target S3 bucket
- "cache" -> only write to the object storage service used for caching
- debug : bool, optional
- Enable debug logging for proxy operations.
- """
- ...
-
- def vllm(*, model: str, backend: str, openai_api_server: bool, debug: bool, card_refresh_interval: int, max_retries: int, retry_alert_frequency: int, engine_args: dict) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- This decorator is used to run vllm APIs as Metaflow task sidecars.
+ This decorator is used to run vllm APIs as Metaflow task sidecars.
 
  User code call
  --------------
@@ -493,53 +412,21 @@ def vllm(*, model: str, backend: str, openai_api_server: bool, debug: bool, card
  ...
 
  @typing.overload
- def catch(*, var: typing.Optional[str] = None, print_exception: bool = True) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def app_deploy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  """
- Specifies that the step will succeed under all circumstances.
-
- The decorator will create an optional artifact, specified by `var`, which
- contains the exception raised. You can use it to detect the presence
- of errors, indicating that all happy-path artifacts produced by the step
- are missing.
-
-
- Parameters
- ----------
- var : str, optional, default None
- Name of the artifact in which to store the caught exception.
- If not specified, the exception is not stored.
- print_exception : bool, default True
- Determines whether or not the exception is printed to
- stdout when caught.
+ Decorator prototype for all step decorators. This function gets specialized
+ and imported for all decorator types by _import_plugin_decorators().
  """
  ...
 
  @typing.overload
- def catch(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def catch(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def app_deploy(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...
 
- def catch(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, var: typing.Optional[str] = None, print_exception: bool = True):
+ def app_deploy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
  """
- Specifies that the step will succeed under all circumstances.
-
- The decorator will create an optional artifact, specified by `var`, which
- contains the exception raised. You can use it to detect the presence
- of errors, indicating that all happy-path artifacts produced by the step
- are missing.
-
-
- Parameters
- ----------
- var : str, optional, default None
- Name of the artifact in which to store the caught exception.
- If not specified, the exception is not stored.
- print_exception : bool, default True
- Determines whether or not the exception is printed to
- stdout when caught.
+ Decorator prototype for all step decorators. This function gets specialized
+ and imported for all decorator types by _import_plugin_decorators().
  """
  ...
 
@@ -582,109 +469,54 @@ def secrets(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None],
  """
  ...
 
- def nvct(*, gpu: int, gpu_type: str) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- Specifies that this step should execute on DGX cloud.
-
-
- Parameters
- ----------
- gpu : int
- Number of GPUs to use.
- gpu_type : str
- Type of Nvidia GPU to use.
- """
- ...
-
- def ollama(*, models: list, backend: str, force_pull: bool, cache_update_policy: str, force_cache_update: bool, debug: bool, circuit_breaker_config: dict, timeout_config: dict) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- This decorator is used to run Ollama APIs as Metaflow task sidecars.
-
- User code call
- --------------
- @ollama(
-     models=[...],
-     ...
- )
-
- Valid backend options
- ---------------------
- - 'local': Run as a separate process on the local task machine.
- - (TODO) 'managed': Outerbounds hosts and selects compute provider.
- - (TODO) 'remote': Spin up a separate instance to serve Ollama models.
-
- Valid model options
- -------------------
- Any model listed at https://ollama.com/search, e.g. 'llama3.2', 'llama3.3'
-
-
- Parameters
- ----------
- models: list[str]
- List of Ollama containers running models in sidecars.
- backend: str
- Determines where and how to run the Ollama process.
- force_pull: bool
- Whether to run `ollama pull` no matter what, or first check the remote cache in the Metaflow datastore for this model key.
- cache_update_policy: str
- Cache update policy: "auto", "force", or "never".
- force_cache_update: bool
- Simple override for the "force" cache update policy.
- debug: bool
- Whether to turn on verbose debugging logs.
- circuit_breaker_config: dict
- Configuration for circuit breaker protection. Keys: failure_threshold, recovery_timeout, reset_timeout.
- timeout_config: dict
- Configuration for various operation timeouts. Keys: pull, stop, health_check, install, server_startup.
- """
- ...
-
  @typing.overload
- def card(*, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def pypi(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Creates a human-readable report, a Metaflow Card, after this step completes.
+ Specifies the PyPI packages for the step.
 
- Note that you may add multiple `@card` decorators in a step with different parameters.
+ Information in this decorator will augment any
+ attributes set in the `@pypi_base` flow-level decorator. Hence,
+ you can use `@pypi_base` to set packages required by all
+ steps and use `@pypi` to specify step-specific overrides.
 
 
  Parameters
  ----------
- type : str, default 'default'
- Card type.
- id : str, optional, default None
- If multiple cards are present, use this id to identify this card.
- options : Dict[str, Any], default {}
- Options passed to the card. The contents depend on the card type.
- timeout : int, default 45
- Interrupt reporting if it takes more than this many seconds.
+ packages : Dict[str, str], default: {}
+ Packages to use for this step. The key is the name of the package
+ and the value is the version to use.
+ python : str, optional, default: None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
  """
  ...
 
  @typing.overload
- def card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ def pypi(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  ...
 
  @typing.overload
- def card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def pypi(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...
 
- def card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45):
+ def pypi(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
  """
- Creates a human-readable report, a Metaflow Card, after this step completes.
+ Specifies the PyPI packages for the step.
 
- Note that you may add multiple `@card` decorators in a step with different parameters.
+ Information in this decorator will augment any
+ attributes set in the `@pypi_base` flow-level decorator. Hence,
+ you can use `@pypi_base` to set packages required by all
+ steps and use `@pypi` to specify step-specific overrides.
 
 
  Parameters
  ----------
- type : str, default 'default'
- Card type.
- id : str, optional, default None
- If multiple cards are present, use this id to identify this card.
- options : Dict[str, Any], default {}
- Options passed to the card. The contents depend on the card type.
- timeout : int, default 45
- Interrupt reporting if it takes more than this many seconds.
+ packages : Dict[str, str], default: {}
+ Packages to use for this step. The key is the name of the package
+ and the value is the version to use.
+ python : str, optional, default: None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
  """
  ...
@@ -768,21 +600,78 @@ def resources(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None]
  ...
 
  @typing.overload
- def test_append_card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- """
- A simple decorator that demonstrates using CardDecoratorInjector
- to inject a card and render simple markdown content.
+ def timeout(*, seconds: int = 0, minutes: int = 0, hours: int = 0) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- ...
-
- @typing.overload
- def test_append_card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def test_append_card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
+ Specifies a timeout for your step.
+
+ This decorator is useful if this step may hang indefinitely.
+
+ This can be used in conjunction with the `@retry` decorator as well as the `@catch` decorator.
+ A timeout is considered to be an exception thrown by the step. It will cause the step to be
+ retried if needed and the exception will be caught by the `@catch` decorator, if present.
+
+ Note that all the values specified in parameters are added together, so if you specify
+ 60 seconds and 1 hour, the decorator will have an effective timeout of 1 hour and 1 minute.
+
+
+ Parameters
+ ----------
+ seconds : int, default 0
+ Number of seconds to wait prior to timing out.
+ minutes : int, default 0
+ Number of minutes to wait prior to timing out.
+ hours : int, default 0
+ Number of hours to wait prior to timing out.
  """
- A simple decorator that demonstrates using CardDecoratorInjector
- to inject a card and render simple markdown content.
+ ...
+
+ @typing.overload
+ def timeout(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def timeout(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def timeout(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, seconds: int = 0, minutes: int = 0, hours: int = 0):
+ """
+ Specifies a timeout for your step.
+
+ This decorator is useful if this step may hang indefinitely.
+
+ This can be used in conjunction with the `@retry` decorator as well as the `@catch` decorator.
+ A timeout is considered to be an exception thrown by the step. It will cause the step to be
+ retried if needed and the exception will be caught by the `@catch` decorator, if present.
+
+ Note that all the values specified in parameters are added together, so if you specify
+ 60 seconds and 1 hour, the decorator will have an effective timeout of 1 hour and 1 minute.
+
+
+ Parameters
+ ----------
+ seconds : int, default 0
+ Number of seconds to wait prior to timing out.
+ minutes : int, default 0
+ Number of minutes to wait prior to timing out.
+ hours : int, default 0
+ Number of hours to wait prior to timing out.
+ """
+ ...
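The `@timeout` docstring above states that `seconds`, `minutes`, and `hours` are added together into one effective timeout, e.g. `@timeout(seconds=60, hours=1)` waits 1 hour and 1 minute. That arithmetic can be sketched as follows; the helper name is hypothetical, not part of the stub:

```python
# Hypothetical sketch of the documented rule: the three timeout parameters
# are summed into a single effective timeout, expressed here in seconds.
def effective_timeout_seconds(seconds=0, minutes=0, hours=0):
    return seconds + minutes * 60 + hours * 3600

# @timeout(seconds=60, hours=1) -> 1 hour and 1 minute, as the docstring says.
print(effective_timeout_seconds(seconds=60, hours=1))
# 3660
```

With all three parameters defaulting to 0, omitting a unit simply contributes nothing to the sum.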
+
+ @typing.overload
+ def fast_bakery_internal(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ """
+ Internal decorator to support Fast bakery
+ """
+ ...
+
+ @typing.overload
+ def fast_bakery_internal(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def fast_bakery_internal(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
+ """
+ Internal decorator to support Fast bakery
  """
  ...
 
@@ -841,175 +730,6 @@ def retry(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], ty
  """
  ...
 
- def kubernetes(*, cpu: int = 1, memory: int = 4096, disk: int = 10240, image: typing.Optional[str] = None, image_pull_policy: str = 'KUBERNETES_IMAGE_PULL_POLICY', image_pull_secrets: typing.List[str] = [], service_account: str = 'METAFLOW_KUBERNETES_SERVICE_ACCOUNT', secrets: typing.Optional[typing.List[str]] = None, node_selector: typing.Union[typing.Dict[str, str], str, None] = None, namespace: str = 'METAFLOW_KUBERNETES_NAMESPACE', gpu: typing.Optional[int] = None, gpu_vendor: str = 'KUBERNETES_GPU_VENDOR', tolerations: typing.List[typing.Dict[str, str]] = [], labels: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_LABELS', annotations: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_ANNOTATIONS', use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: typing.Optional[int] = None, tmpfs_path: typing.Optional[str] = '/metaflow_temp', persistent_volume_claims: typing.Optional[typing.Dict[str, str]] = None, shared_memory: typing.Optional[int] = None, port: typing.Optional[int] = None, compute_pool: typing.Optional[str] = None, hostname_resolution_timeout: int = 600, qos: str = 'Burstable', security_context: typing.Optional[typing.Dict[str, typing.Any]] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- Specifies that this step should execute on Kubernetes.
-
-
- Parameters
- ----------
- cpu : int, default 1
- Number of CPUs required for this step. If `@resources` is
- also present, the maximum value from all decorators is used.
- memory : int, default 4096
- Memory size (in MB) required for this step. If
- `@resources` is also present, the maximum value from all decorators is
- used.
- disk : int, default 10240
- Disk size (in MB) required for this step. If
- `@resources` is also present, the maximum value from all decorators is
- used.
- image : str, optional, default None
- Docker image to use when launching on Kubernetes. If not specified, and
- METAFLOW_KUBERNETES_CONTAINER_IMAGE is specified, that image is used. If
- not, a default Docker image mapping to the current version of Python is used.
- image_pull_policy: str, default KUBERNETES_IMAGE_PULL_POLICY
- If given, the imagePullPolicy to be applied to the Docker image of the step.
- image_pull_secrets: List[str], default []
- The default is extracted from METAFLOW_KUBERNETES_IMAGE_PULL_SECRETS.
- Kubernetes image pull secrets to use when pulling container images
- in Kubernetes.
- service_account : str, default METAFLOW_KUBERNETES_SERVICE_ACCOUNT
- Kubernetes service account to use when launching pod in Kubernetes.
- secrets : List[str], optional, default None
- Kubernetes secrets to use when launching pod in Kubernetes. These
- secrets are in addition to the ones defined in `METAFLOW_KUBERNETES_SECRETS`
- in Metaflow configuration.
- node_selector: Union[Dict[str,str], str], optional, default None
- Kubernetes node selector(s) to apply to the pod running the task.
- Can be passed in as a comma-separated string of values, e.g.
- 'kubernetes.io/os=linux,kubernetes.io/arch=amd64', or as a dictionary
- {'kubernetes.io/os': 'linux', 'kubernetes.io/arch': 'amd64'}
- namespace : str, default METAFLOW_KUBERNETES_NAMESPACE
- Kubernetes namespace to use when launching pod in Kubernetes.
- gpu : int, optional, default None
- Number of GPUs required for this step. A value of zero implies that
- the scheduled node should not have GPUs.
- gpu_vendor : str, default KUBERNETES_GPU_VENDOR
- The vendor of the GPUs to be used for this step.
- tolerations : List[Dict[str,str]], default []
- The default is extracted from METAFLOW_KUBERNETES_TOLERATIONS.
- Kubernetes tolerations to use when launching pod in Kubernetes.
- labels: Dict[str, str], default: METAFLOW_KUBERNETES_LABELS
- Kubernetes labels to use when launching pod in Kubernetes.
- annotations: Dict[str, str], default: METAFLOW_KUBERNETES_ANNOTATIONS
- Kubernetes annotations to use when launching pod in Kubernetes.
- use_tmpfs : bool, default False
- This enables an explicit tmpfs mount for this step.
- tmpfs_tempdir : bool, default True
- Sets METAFLOW_TEMPDIR to tmpfs_path, if set, for this step.
- tmpfs_size : int, optional, default: None
- The value for the size (in MiB) of the tmpfs mount for this step.
- This parameter maps to the `--tmpfs` option in Docker. Defaults to 50% of the
- memory allocated for this step.
- tmpfs_path : str, optional, default /metaflow_temp
- Path to tmpfs mount for this step.
- persistent_volume_claims : Dict[str, str], optional, default None
- A map (dictionary) of persistent volumes to be mounted to the pod for this step. The map is from persistent
- volumes to the path to which the volume is to be mounted, e.g., `{'pvc-name': '/path/to/mount/on'}`.
- shared_memory: int, optional
- Shared memory size (in MiB) required for this step.
- port: int, optional
- Port number to specify in the Kubernetes job object.
- compute_pool : str, optional, default None
- Compute pool to be used for this step.
- If not specified, any accessible compute pool within the perimeter is used.
- hostname_resolution_timeout: int, default 10 * 60
- Timeout in seconds for the worker tasks in the gang-scheduled cluster to resolve the hostname of the control task.
- Only applicable when @parallel is used.
- qos: str, default: Burstable
- Quality of Service class to assign to the pod. Supported values are: Guaranteed, Burstable, BestEffort.
-
- security_context: Dict[str, Any], optional, default None
- Container security context. Applies to the task container. Allows the following keys:
- - privileged: bool, optional, default None
- - allow_privilege_escalation: bool, optional, default None
- - run_as_user: int, optional, default None
928
- - run_as_group: int, optional, default None
929
- - run_as_non_root: bool, optional, default None
930
- """
931
- ...
932
-
933
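As an aside, the `node_selector` parameter above accepts either a dict or a comma-separated `key=value` string. A minimal sketch of how such an input could be normalized (a hypothetical helper for illustration only; Metaflow performs its own parsing internally):

```python
def normalize_node_selector(node_selector):
    """Turn a node selector given as a dict or as a comma-separated
    'key=value' string into a plain dict. Illustrative helper only."""
    if node_selector is None:
        return {}
    if isinstance(node_selector, dict):
        return dict(node_selector)
    # 'kubernetes.io/os=linux,kubernetes.io/arch=amd64' -> dict
    pairs = (item.split("=", 1) for item in node_selector.split(",") if item)
    return {key.strip(): value.strip() for key, value in pairs}
```

Under this sketch, the string form and the dict form from the docstring normalize to the same mapping.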
- @typing.overload
- def conda(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- Specifies the Conda environment for the step.
-
- Information in this decorator will augment any
- attributes set in the `@conda_base` flow-level decorator. Hence,
- you can use `@conda_base` to set packages required by all
- steps and use `@conda` to specify step-specific overrides.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables @conda.
- """
- ...
-
- @typing.overload
- def conda(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def conda(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def conda(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
- """
- Specifies the Conda environment for the step.
-
- Information in this decorator will augment any
- attributes set in the `@conda_base` flow-level decorator. Hence,
- you can use `@conda_base` to set packages required by all
- steps and use `@conda` to specify step-specific overrides.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables @conda.
- """
- ...
-
- @typing.overload
- def nebius_s3_proxy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- """
- Nebius-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
- It exists to make it easier for users to know that this decorator should only be used with
- a Neo Cloud like Nebius.
- """
- ...
-
- @typing.overload
- def nebius_s3_proxy(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def nebius_s3_proxy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
- """
- Nebius-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
- It exists to make it easier for users to know that this decorator should only be used with
- a Neo Cloud like Nebius.
- """
- ...
-
  def huggingface_hub(*, temp_dir_root: typing.Optional[str] = None, load: typing.Union[typing.List[str], typing.List[typing.Tuple[typing.Dict, str]], typing.List[typing.Tuple[str, str]], typing.List[typing.Dict], None]) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
  Decorator that helps cache, version and store models/datasets from huggingface hub.
@@ -1090,19 +810,132 @@ def huggingface_hub(*, temp_dir_root: typing.Optional[str] = None, load: typing.
  """
  ...

- def nvidia(*, gpu: int, gpu_type: str, queue_timeout: int) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ @typing.overload
+ def conda(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies that this step should execute on DGX cloud.
+ Specifies the Conda environment for the step.
+
+ Information in this decorator will augment any
+ attributes set in the `@conda_base` flow-level decorator. Hence,
+ you can use `@conda_base` to set packages required by all
+ steps and use `@conda` to specify step-specific overrides.


  Parameters
  ----------
- gpu : int
- Number of GPUs to use.
- gpu_type : str
- Type of Nvidia GPU to use.
- queue_timeout : int
- Time to keep the job in NVCF's queue.
+ packages : Dict[str, str], default {}
+ Packages to use for this step. The key is the name of the package
+ and the value is the version to use.
+ libraries : Dict[str, str], default {}
+ Supported for backward compatibility. When used with packages, packages will take precedence.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ disabled : bool, default False
+ If set to True, disables @conda.
+ """
+ ...
+
+ @typing.overload
+ def conda(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def conda(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def conda(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
+ """
+ Specifies the Conda environment for the step.
+
+ Information in this decorator will augment any
+ attributes set in the `@conda_base` flow-level decorator. Hence,
+ you can use `@conda_base` to set packages required by all
+ steps and use `@conda` to specify step-specific overrides.
+
+
+ Parameters
+ ----------
+ packages : Dict[str, str], default {}
+ Packages to use for this step. The key is the name of the package
+ and the value is the version to use.
+ libraries : Dict[str, str], default {}
+ Supported for backward compatibility. When used with packages, packages will take precedence.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ disabled : bool, default False
+ If set to True, disables @conda.
+ """
+ ...
+
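The `@conda` stubs above use `typing.overload` to describe a decorator that can be applied both bare (`@conda`) and with keyword arguments (`@conda(python='3.7.4')`). A runnable toy sketch of that calling convention (not Metaflow's implementation; the `conda_spec` attribute is made up for illustration):

```python
def conda(f=None, *, packages=None, disabled=False):
    """Toy dual-form decorator: records a hypothetical spec on the function."""
    def decorate(func):
        func.conda_spec = {"packages": packages or {}, "disabled": disabled}
        return func
    if f is None:
        return decorate   # called with arguments: @conda(packages={...})
    return decorate(f)    # called bare: @conda

@conda
def train():
    return "trained"

@conda(packages={"numpy": "1.26.4"})
def score():
    return "scored"
```

The final non-overload signature (`f=None` plus keyword-only parameters) is what makes both spellings type-check against the same function.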
+ @typing.overload
+ def card(*, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ """
+ Creates a human-readable report, a Metaflow Card, after this step completes.
+
+ Note that you may add multiple `@card` decorators in a step with different parameters.
+
+
+ Parameters
+ ----------
+ type : str, default 'default'
+ Card type.
+ id : str, optional, default None
+ If multiple cards are present, use this id to identify this card.
+ options : Dict[str, Any], default {}
+ Options passed to the card. The contents depend on the card type.
+ timeout : int, default 45
+ Interrupt reporting if it takes more than this many seconds.
+ """
+ ...
+
+ @typing.overload
+ def card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45):
+ """
+ Creates a human-readable report, a Metaflow Card, after this step completes.
+
+ Note that you may add multiple `@card` decorators in a step with different parameters.
+
+
+ Parameters
+ ----------
+ type : str, default 'default'
+ Card type.
+ id : str, optional, default None
+ If multiple cards are present, use this id to identify this card.
+ options : Dict[str, Any], default {}
+ Options passed to the card. The contents depend on the card type.
+ timeout : int, default 45
+ Interrupt reporting if it takes more than this many seconds.
+ """
+ ...
+
+ @typing.overload
+ def coreweave_s3_proxy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ """
+ CoreWeave-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
+ It exists to make it easier for users to know that this decorator should only be used with
+ a Neo Cloud like CoreWeave.
+ """
+ ...
+
+ @typing.overload
+ def coreweave_s3_proxy(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def coreweave_s3_proxy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
+ """
+ CoreWeave-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
+ It exists to make it easier for users to know that this decorator should only be used with
+ a Neo Cloud like CoreWeave.
  """
  ...

@@ -1235,124 +1068,291 @@ def model(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], ty
  """
  ...

- @typing.overload
- def environment(*, vars: typing.Dict[str, str] = {}) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def nvct(*, gpu: int, gpu_type: str) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies environment variables to be set prior to the execution of a step.
+ Specifies that this step should execute on DGX cloud.


  Parameters
  ----------
- vars : Dict[str, str], default {}
- Dictionary of environment variables to set.
+ gpu : int
+ Number of GPUs to use.
+ gpu_type : str
+ Type of Nvidia GPU to use.
  """
  ...

  @typing.overload
- def environment(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ def nebius_s3_proxy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ """
+ Nebius-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
+ It exists to make it easier for users to know that this decorator should only be used with
+ a Neo Cloud like Nebius.
+ """
  ...

  @typing.overload
- def environment(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def nebius_s3_proxy(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def environment(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, vars: typing.Dict[str, str] = {}):
+ def nebius_s3_proxy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
  """
- Specifies environment variables to be set prior to the execution of a step.
-
-
- Parameters
- ----------
- vars : Dict[str, str], default {}
- Dictionary of environment variables to set.
+ Nebius-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
+ It exists to make it easier for users to know that this decorator should only be used with
+ a Neo Cloud like Nebius.
  """
  ...

- @typing.overload
- def timeout(*, seconds: int = 0, minutes: int = 0, hours: int = 0) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def s3_proxy(*, integration_name: typing.Optional[str] = None, write_mode: typing.Optional[str] = None, debug: typing.Optional[bool] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies a timeout for your step.
-
- This decorator is useful if this step may hang indefinitely.
-
- This can be used in conjunction with the `@retry` decorator as well as the `@catch` decorator.
- A timeout is considered to be an exception thrown by the step. It will cause the step to be
- retried if needed and the exception will be caught by the `@catch` decorator, if present.
-
- Note that all the values specified in parameters are added together so if you specify
- 60 seconds and 1 hour, the decorator will have an effective timeout of 1 hour and 1 minute.
+ S3 Proxy decorator for routing S3 requests through a local proxy service.


  Parameters
  ----------
- seconds : int, default 0
- Number of seconds to wait prior to timing out.
- minutes : int, default 0
- Number of minutes to wait prior to timing out.
- hours : int, default 0
- Number of hours to wait prior to timing out.
+ integration_name : str, optional
+ Name of the S3 proxy integration. If not specified, will use the only
+ available S3 proxy integration in the namespace (fails if multiple exist).
+ write_mode : str, optional
+ The desired behavior during write operations to the target (origin) S3 bucket.
+ Allowed options are:
+ "origin-and-cache" -> write to both the target S3 bucket and local object
+ storage
+ "origin" -> only write to the target S3 bucket
+ "cache" -> only write to the object storage service used for caching
+ debug : bool, optional
+ Enable debug logging for proxy operations.
  """
  ...

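The `write_mode` options of `s3_proxy` form a small routing policy: a write may reach the origin bucket, the cache, or both. A hypothetical helper illustrating that dispatch (the function name and the fallback mode are assumptions for illustration, not part of the package):

```python
VALID_WRITE_MODES = {"origin-and-cache", "origin", "cache"}

def write_targets(write_mode=None):
    """Map a write_mode to the stores a write should reach.
    Illustrative only; the fallback when no mode is given is an assumption."""
    mode = write_mode or "origin-and-cache"
    if mode not in VALID_WRITE_MODES:
        raise ValueError(f"unknown write_mode: {mode!r}")
    return {
        "origin-and-cache": ("origin", "cache"),
        "origin": ("origin",),
        "cache": ("cache",),
    }[mode]
```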
- @typing.overload
- def timeout(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def timeout(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def timeout(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, seconds: int = 0, minutes: int = 0, hours: int = 0):
+ def ollama(*, models: list, backend: str, force_pull: bool, cache_update_policy: str, force_cache_update: bool, debug: bool, circuit_breaker_config: dict, timeout_config: dict) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies a timeout for your step.
+ This decorator is used to run Ollama APIs as Metaflow task sidecars.

- This decorator is useful if this step may hang indefinitely.
+ User code call
+ --------------
+ @ollama(
+ models=[...],
+ ...
+ )

- This can be used in conjunction with the `@retry` decorator as well as the `@catch` decorator.
- A timeout is considered to be an exception thrown by the step. It will cause the step to be
- retried if needed and the exception will be caught by the `@catch` decorator, if present.
+ Valid backend options
+ ---------------------
+ - 'local': Run as a separate process on the local task machine.
+ - (TODO) 'managed': Outerbounds hosts and selects compute provider.
+ - (TODO) 'remote': Spin up separate instance to serve Ollama models.

- Note that all the values specified in parameters are added together so if you specify
- 60 seconds and 1 hour, the decorator will have an effective timeout of 1 hour and 1 minute.
+ Valid model options
+ -------------------
+ Any model here https://ollama.com/search, e.g. 'llama3.2', 'llama3.3'


  Parameters
  ----------
- seconds : int, default 0
- Number of seconds to wait prior to timing out.
- minutes : int, default 0
- Number of minutes to wait prior to timing out.
- hours : int, default 0
- Number of hours to wait prior to timing out.
- """
- ...
-
- @typing.overload
- def coreweave_s3_proxy(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- """
- CoreWeave-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
- It exists to make it easier for users to know that this decorator should only be used with
- a Neo Cloud like CoreWeave.
+ models: list[str]
+ List of Ollama containers running models in sidecars.
+ backend: str
+ Determines where and how to run the Ollama process.
+ force_pull: bool
+ Whether to always run `ollama pull`, or to first check the remote cache in the Metaflow datastore for this model key.
+ cache_update_policy: str
+ Cache update policy: "auto", "force", or "never".
+ force_cache_update: bool
+ Simple override for the "force" cache update policy.
+ debug: bool
+ Whether to turn on verbose debugging logs.
+ circuit_breaker_config: dict
+ Configuration for circuit breaker protection. Keys: failure_threshold, recovery_timeout, reset_timeout.
+ timeout_config: dict
+ Configuration for various operation timeouts. Keys: pull, stop, health_check, install, server_startup.
  """
  ...

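The removed `@timeout` docstring notes that the `seconds`, `minutes`, and `hours` values are added together, so 60 seconds plus 1 hour yields an effective timeout of 1 hour and 1 minute. The arithmetic, as a tiny sketch (hypothetical helper name, not Metaflow code):

```python
def effective_timeout_seconds(seconds=0, minutes=0, hours=0):
    """All three @timeout components are summed, per the docstring."""
    return seconds + 60 * minutes + 3600 * hours

# 60 seconds + 1 hour -> 3660 seconds, i.e. 1 hour and 1 minute
```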
  @typing.overload
- def coreweave_s3_proxy(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def coreweave_s3_proxy(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
- """
- CoreWeave-specific S3 Proxy decorator for routing S3 requests through a local proxy service.
- It exists to make it easier for users to know that this decorator should only be used with
- a Neo Cloud like CoreWeave.
- """
- ...
-
- def project(*, name: str, branch: typing.Optional[str] = None, production: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def checkpoint(*, load_policy: str = 'fresh', temp_dir_root: str = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies what flows belong to the same project.
+ Enables checkpointing for a step.

- A project-specific namespace is created for all flows that
+ > Examples
+
+ - Saving Checkpoints
+
+ ```python
+ @checkpoint
+ @step
+ def train(self):
+ model = create_model(self.parameters, checkpoint_path = None)
+ for i in range(self.epochs):
+ # some training logic
+ loss = model.train(self.dataset)
+ if i % 10 == 0:
+ model.save(
+ current.checkpoint.directory,
+ )
+ # saves the contents of the `current.checkpoint.directory` as a checkpoint
+ # and returns a reference dictionary to the checkpoint saved in the datastore
+ self.latest_checkpoint = current.checkpoint.save(
+ name="epoch_checkpoint",
+ metadata={
+ "epoch": i,
+ "loss": loss,
+ }
+ )
+ ```
+
+ - Using Loaded Checkpoints
+
+ ```python
+ @retry(times=3)
+ @checkpoint
+ @step
+ def train(self):
+ # Assume that the task has restarted and the previous attempt of the task
+ # saved a checkpoint
+ checkpoint_path = None
+ if current.checkpoint.is_loaded: # Check if a checkpoint is loaded
+ print("Loaded checkpoint from the previous attempt")
+ checkpoint_path = current.checkpoint.directory
+
+ model = create_model(self.parameters, checkpoint_path = checkpoint_path)
+ for i in range(self.epochs):
+ ...
+ ```
+
+
+ Parameters
+ ----------
+ load_policy : str, default: "fresh"
+ The policy for loading the checkpoint. The following policies are supported:
+ - "eager": Loads the latest available checkpoint within the namespace.
+ With this mode, the latest checkpoint written by any previous task (can be even a different run) of the step
+ will be loaded at the start of the task.
+ - "none": Do not load any checkpoint.
+ - "fresh": Loads the latest checkpoint created within the running Task.
+ This mode helps load checkpoints across various retry attempts of the same task.
+ With this mode, no checkpoint will be loaded at the start of a task but any checkpoints
+ created within the task will be loaded when the task retries execution after a failure.
+
+ temp_dir_root : str, default: None
+ The root directory under which `current.checkpoint.directory` will be created.
+ """
+ ...
+
+ @typing.overload
+ def checkpoint(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def checkpoint(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def checkpoint(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, load_policy: str = 'fresh', temp_dir_root: str = None):
+ """
+ Enables checkpointing for a step.
+
+ > Examples
+
+ - Saving Checkpoints
+
+ ```python
+ @checkpoint
+ @step
+ def train(self):
+ model = create_model(self.parameters, checkpoint_path = None)
+ for i in range(self.epochs):
+ # some training logic
+ loss = model.train(self.dataset)
+ if i % 10 == 0:
+ model.save(
+ current.checkpoint.directory,
+ )
+ # saves the contents of the `current.checkpoint.directory` as a checkpoint
+ # and returns a reference dictionary to the checkpoint saved in the datastore
+ self.latest_checkpoint = current.checkpoint.save(
+ name="epoch_checkpoint",
+ metadata={
+ "epoch": i,
+ "loss": loss,
+ }
+ )
+ ```
+
+ - Using Loaded Checkpoints
+
+ ```python
+ @retry(times=3)
+ @checkpoint
+ @step
+ def train(self):
+ # Assume that the task has restarted and the previous attempt of the task
+ # saved a checkpoint
+ checkpoint_path = None
+ if current.checkpoint.is_loaded: # Check if a checkpoint is loaded
+ print("Loaded checkpoint from the previous attempt")
+ checkpoint_path = current.checkpoint.directory
+
+ model = create_model(self.parameters, checkpoint_path = checkpoint_path)
+ for i in range(self.epochs):
+ ...
+ ```
+
+
+ Parameters
+ ----------
+ load_policy : str, default: "fresh"
+ The policy for loading the checkpoint. The following policies are supported:
+ - "eager": Loads the latest available checkpoint within the namespace.
+ With this mode, the latest checkpoint written by any previous task (can be even a different run) of the step
+ will be loaded at the start of the task.
+ - "none": Do not load any checkpoint.
+ - "fresh": Loads the latest checkpoint created within the running Task.
+ This mode helps load checkpoints across various retry attempts of the same task.
+ With this mode, no checkpoint will be loaded at the start of a task but any checkpoints
+ created within the task will be loaded when the task retries execution after a failure.
+
+ temp_dir_root : str, default: None
+ The root directory under which `current.checkpoint.directory` will be created.
+ """
+ ...
+
1319
+ def environment(*, vars: typing.Dict[str, str] = {}) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
1320
+ """
1321
+ Specifies environment variables to be set prior to the execution of a step.
1322
+
1323
+
1324
+ Parameters
1325
+ ----------
1326
+ vars : Dict[str, str], default {}
1327
+ Dictionary of environment variables to set.
1328
+ """
1329
+ ...
1330
+
1331
+ @typing.overload
1332
+ def environment(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
1333
+ ...
1334
+
1335
+ @typing.overload
1336
+ def environment(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
1337
+ ...
1338
+
1339
+ def environment(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, vars: typing.Dict[str, str] = {}):
1340
+ """
1341
+ Specifies environment variables to be set prior to the execution of a step.
1342
+
1343
+
1344
+ Parameters
1345
+ ----------
1346
+ vars : Dict[str, str], default {}
1347
+ Dictionary of environment variables to set.
1348
+ """
1349
+ ...
1350
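As an aside for readers of this stub diff, the `@typing.overload` pairs above describe a dual-mode decorator: usable bare (`@environment`) or with keyword arguments (`@environment(vars={...})`). A minimal plain-Python sketch of that pattern (illustrative only, not the Metaflow implementation):

```python
import functools
import os

def environment(f=None, *, vars={}):
    """Dual-mode decorator sketch: works both as @environment and
    as @environment(vars={...}), mirroring the stub's overloads."""
    def wrap(func):
        @functools.wraps(func)
        def inner(*args, **kwargs):
            os.environ.update(vars)  # set env vars prior to step execution
            return func(*args, **kwargs)
        return inner
    if f is None:
        return wrap      # called with arguments: @environment(vars=...)
    return wrap(f)       # called bare: @environment

@environment(vars={"MY_FLAG": "1"})
def step():
    return os.environ["MY_FLAG"]
```

When `f` is None, the factory was called with keyword arguments and returns the real decorator; otherwise it was applied directly to the function.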
+
+ def project(*, name: str, branch: typing.Optional[str] = None, production: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ Specifies what flows belong to the same project.
+
+ A project-specific namespace is created for all flows that
 use the same `@project(name)`.

 
@@ -1384,53 +1384,53 @@ def project(*, name: str, branch: typing.Optional[str] = None, production: bool
 ...

 @typing.overload
- def schedule(*, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ def conda_base(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
 """
- Specifies the times when the flow should be run when running on a
- production scheduler.
+ Specifies the Conda environment for all steps of the flow.
+
+ Use `@conda_base` to set common libraries required by all
+ steps and use `@conda` to specify step-specific additions.


 Parameters
 ----------
- hourly : bool, default False
- Run the workflow hourly.
- daily : bool, default True
- Run the workflow daily.
- weekly : bool, default False
- Run the workflow weekly.
- cron : str, optional, default None
- Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
- specified by this expression.
- timezone : str, optional, default None
- Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
- which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
+ packages : Dict[str, str], default {}
+ Packages to use for this flow. The key is the name of the package
+ and the value is the version to use.
+ libraries : Dict[str, str], default {}
+ Supported for backward compatibility. When used with packages, packages will take precedence.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ disabled : bool, default False
+ If set to True, disables Conda.
 """
 ...

 @typing.overload
- def schedule(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ def conda_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
 ...

- def schedule(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None):
+ def conda_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
 """
- Specifies the times when the flow should be run when running on a
- production scheduler.
+ Specifies the Conda environment for all steps of the flow.
+
+ Use `@conda_base` to set common libraries required by all
+ steps and use `@conda` to specify step-specific additions.


 Parameters
 ----------
- hourly : bool, default False
- Run the workflow hourly.
- daily : bool, default True
- Run the workflow daily.
- weekly : bool, default False
- Run the workflow weekly.
- cron : str, optional, default None
- Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
- specified by this expression.
- timezone : str, optional, default None
- Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
- which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
+ packages : Dict[str, str], default {}
+ Packages to use for this flow. The key is the name of the package
+ and the value is the version to use.
+ libraries : Dict[str, str], default {}
+ Supported for backward compatibility. When used with packages, packages will take precedence.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ disabled : bool, default False
+ If set to True, disables Conda.
 """
 ...
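The `conda_base` docstring above notes that `packages` takes precedence over the legacy `libraries` argument. That precedence rule can be sketched as a simple dictionary merge (hypothetical helper for illustration; not part of the stubs):

```python
def resolve_packages(packages={}, libraries={}):
    """Merge legacy `libraries` with `packages`; when the same
    dependency appears in both, the `packages` pin wins."""
    merged = dict(libraries)   # start from legacy entries
    merged.update(packages)    # packages take precedence
    return merged

# A numpy pin in `packages` overrides the legacy `libraries` pin,
# while unrelated legacy entries (pandas) survive the merge.
resolve_packages(packages={"numpy": "1.26.0"},
                 libraries={"numpy": "1.21.0", "pandas": "1.5.3"})
```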
 
@@ -1477,6 +1477,242 @@ def airflow_external_task_sensor(*, timeout: int, poke_interval: int, mode: str,
 """
 ...

+ def airflow_s3_key_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, bucket_key: typing.Union[str, typing.List[str]], bucket_name: str, wildcard_match: bool, aws_conn_id: str, verify: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ The `@airflow_s3_key_sensor` decorator attaches an Airflow [S3KeySensor](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/_api/airflow/providers/amazon/aws/sensors/s3/index.html#airflow.providers.amazon.aws.sensors.s3.S3KeySensor)
+ before the start step of the flow. This decorator only works when a flow is scheduled on Airflow
+ and is compiled using `airflow create`. More than one `@airflow_s3_key_sensor` can be
+ added as a flow decorator. Adding more than one decorator will ensure that the `start` step
+ starts only after all sensors finish.
+
+
+ Parameters
+ ----------
+ timeout : int
+ Time, in seconds, before the task times out and fails. (Default: 3600)
+ poke_interval : int
+ Time, in seconds, that the job should wait between each try. (Default: 60)
+ mode : str
+ How the sensor operates. Options are: { poke | reschedule }. (Default: "poke")
+ exponential_backoff : bool
+ Allow progressively longer waits between pokes by using an exponential backoff algorithm. (Default: True)
+ pool : str
+ The slot pool this task should run in;
+ slot pools are a way to limit concurrency for certain tasks. (Default: None)
+ soft_fail : bool
+ Set to true to mark the task as SKIPPED on failure. (Default: False)
+ name : str
+ Name of the sensor on Airflow.
+ description : str
+ Description of the sensor in the Airflow UI.
+ bucket_key : Union[str, List[str]]
+ The key(s) being waited on. Supports a full s3:// style URL or a relative path from the root level.
+ When it is specified as a full s3:// URL, please leave `bucket_name` as None.
+ bucket_name : str
+ Name of the S3 bucket. Only needed when bucket_key is not provided as a full s3:// URL.
+ When specified, all the keys passed to bucket_key refer to this bucket. (Default: None)
+ wildcard_match : bool
+ Whether the bucket_key should be interpreted as a Unix wildcard pattern. (Default: False)
+ aws_conn_id : str
+ A reference to the S3 connection on Airflow. (Default: None)
+ verify : bool
+ Whether or not to verify SSL certificates for the S3 connection. (Default: None)
+ """
+ ...
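To make the `timeout`, `poke_interval`, and `exponential_backoff` parameters above concrete, here is a simplified sketch of how a poke schedule could unfold (illustration only; the real Airflow sensor adds jitter and other details):

```python
def poke_schedule(poke_interval=60, timeout=3600, exponential_backoff=True):
    """Return the wait (in seconds) before each poke until `timeout`
    would be exceeded, doubling the interval when backoff is enabled."""
    waits, elapsed, interval = [], 0, poke_interval
    while elapsed + interval <= timeout:
        waits.append(interval)
        elapsed += interval
        if exponential_backoff:
            interval *= 2  # progressively longer waits between pokes
    return waits

# With backoff, a 3600 s timeout allows far fewer pokes than
# the fixed-interval schedule would.
poke_schedule(poke_interval=60, timeout=3600, exponential_backoff=True)
```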
+
+ @typing.overload
+ def trigger_on_finish(*, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ Specifies the flow(s) that this flow depends on.
+
+ ```
+ @trigger_on_finish(flow='FooFlow')
+ ```
+ or
+ ```
+ @trigger_on_finish(flows=['FooFlow', 'BarFlow'])
+ ```
+ This decorator respects the @project decorator and triggers the flow
+ when upstream runs within the same namespace complete successfully.
+
+ Additionally, you can specify project-aware upstream flow dependencies
+ by specifying the fully qualified project_flow_name.
+ ```
+ @trigger_on_finish(flow='my_project.branch.my_branch.FooFlow')
+ ```
+ or
+ ```
+ @trigger_on_finish(flows=['my_project.branch.my_branch.FooFlow', 'BarFlow'])
+ ```
+
+ You can also specify just the project or project branch (other values will be
+ inferred from the current project or project branch):
+ ```
+ @trigger_on_finish(flow={"name": "FooFlow", "project": "my_project", "project_branch": "branch"})
+ ```
+
+ Note that `branch` is typically one of:
+ - `prod`
+ - `user.bob`
+ - `test.my_experiment`
+ - `prod.staging`
+
+
+ Parameters
+ ----------
+ flow : Union[str, Dict[str, str]], optional, default None
+ Upstream flow dependency for this flow.
+ flows : List[Union[str, Dict[str, str]]], default []
+ Upstream flow dependencies for this flow.
+ options : Dict[str, Any], default {}
+ Backend-specific configuration for tuning eventing behavior.
+ """
+ ...
+
+ @typing.overload
+ def trigger_on_finish(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ ...
+
+ def trigger_on_finish(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}):
+ """
+ Specifies the flow(s) that this flow depends on.
+
+ ```
+ @trigger_on_finish(flow='FooFlow')
+ ```
+ or
+ ```
+ @trigger_on_finish(flows=['FooFlow', 'BarFlow'])
+ ```
+ This decorator respects the @project decorator and triggers the flow
+ when upstream runs within the same namespace complete successfully.
+
+ Additionally, you can specify project-aware upstream flow dependencies
+ by specifying the fully qualified project_flow_name.
+ ```
+ @trigger_on_finish(flow='my_project.branch.my_branch.FooFlow')
+ ```
+ or
+ ```
+ @trigger_on_finish(flows=['my_project.branch.my_branch.FooFlow', 'BarFlow'])
+ ```
+
+ You can also specify just the project or project branch (other values will be
+ inferred from the current project or project branch):
+ ```
+ @trigger_on_finish(flow={"name": "FooFlow", "project": "my_project", "project_branch": "branch"})
+ ```
+
+ Note that `branch` is typically one of:
+ - `prod`
+ - `user.bob`
+ - `test.my_experiment`
+ - `prod.staging`
+
+
+ Parameters
+ ----------
+ flow : Union[str, Dict[str, str]], optional, default None
+ Upstream flow dependency for this flow.
+ flows : List[Union[str, Dict[str, str]]], default []
+ Upstream flow dependencies for this flow.
+ options : Dict[str, Any], default {}
+ Backend-specific configuration for tuning eventing behavior.
+ """
+ ...
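A fully qualified `project_flow_name` such as `'my_project.user.bob.FooFlow'` bundles a project, a branch, and a flow name into one dotted string. A hypothetical parser (illustration only; the branch semantics here are an assumption, since real branches like `user.bob` can themselves contain dots):

```python
def parse_flow_spec(spec):
    """Split 'project.<branch>.FlowName' into its parts; a bare
    'FlowName' has no project/branch. Hypothetical helper, not a
    Metaflow API; assumes the first part is the project and the
    last part is the flow name."""
    parts = spec.split(".")
    if len(parts) == 1:
        return {"name": spec}
    return {
        "name": parts[-1],
        "project": parts[0],
        # everything in between is the (possibly dotted) branch
        "project_branch": ".".join(parts[1:-1]),
    }
```

This mirrors the dict form `@trigger_on_finish(flow={"name": ..., "project": ..., "project_branch": ...})` shown in the docstring above.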
+
+ @typing.overload
+ def schedule(*, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ Specifies the times when the flow should be run when running on a
+ production scheduler.
+
+
+ Parameters
+ ----------
+ hourly : bool, default False
+ Run the workflow hourly.
+ daily : bool, default True
+ Run the workflow daily.
+ weekly : bool, default False
+ Run the workflow weekly.
+ cron : str, optional, default None
+ Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
+ specified by this expression.
+ timezone : str, optional, default None
+ Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
+ which accept timezones in [IANA format](https://nodatime.org/TimeZones).
+ """
+ ...
+
+ @typing.overload
+ def schedule(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ ...
+
+ def schedule(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None):
+ """
+ Specifies the times when the flow should be run when running on a
+ production scheduler.
+
+
+ Parameters
+ ----------
+ hourly : bool, default False
+ Run the workflow hourly.
+ daily : bool, default True
+ Run the workflow daily.
+ weekly : bool, default False
+ Run the workflow weekly.
+ cron : str, optional, default None
+ Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
+ specified by this expression.
+ timezone : str, optional, default None
+ Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
+ which accept timezones in [IANA format](https://nodatime.org/TimeZones).
+ """
+ ...
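The `@schedule` parameters above mix convenience flags (`hourly`/`daily`/`weekly`) with an explicit `cron` expression. One way to think about their interaction is that an explicit `cron` string overrides the flags; a sketch under that assumption (the exact expressions Metaflow emits may differ):

```python
def to_cron(hourly=False, daily=True, weekly=False, cron=None):
    """Resolve the schedule to a single cron expression; an explicit
    `cron` string wins over the convenience flags (assumed precedence)."""
    if cron:
        return cron            # explicit expression takes priority
    if hourly:
        return "0 * * * *"     # top of every hour
    if weekly:
        return "0 0 * * 0"     # midnight every Sunday
    return "0 0 * * *"         # daily default, midnight
```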
+
+ @typing.overload
+ def pypi_base(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
+ """
+ Specifies the PyPI packages for all steps of the flow.
+
+ Use `@pypi_base` to set common packages required by all
+ steps and use `@pypi` to specify step-specific overrides.
+
+ Parameters
+ ----------
+ packages : Dict[str, str], default: {}
+ Packages to use for this flow. The key is the name of the package
+ and the value is the version to use.
+ python : str, optional, default: None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ """
+ ...
+
+ @typing.overload
+ def pypi_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
+ ...
+
+ def pypi_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
+ """
+ Specifies the PyPI packages for all steps of the flow.
+
+ Use `@pypi_base` to set common packages required by all
+ steps and use `@pypi` to specify step-specific overrides.
+
+ Parameters
+ ----------
+ packages : Dict[str, str], default: {}
+ Packages to use for this flow. The key is the name of the package
+ and the value is the version to use.
+ python : str, optional, default: None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ """
+ ...
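Both `@pypi_base` and `@conda_base` document that `python=None` means "use the version of the interpreter that started the run". That defaulting rule can be sketched with a hypothetical helper:

```python
import sys

def resolve_python_version(python=None):
    """Illustrative helper (not part of the stubs): fall back to the
    running interpreter's version when no explicit pin is given."""
    if python is not None:
        return python                        # explicit pin, e.g. '3.7.4'
    return "%d.%d.%d" % sys.version_info[:3]  # interpreter that started the run
```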
+
 @typing.overload
 def trigger(*, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
 """
@@ -1570,98 +1806,6 @@ def trigger(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, event: t
 """
 ...

- @typing.overload
- def conda_base(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
- """
- Specifies the Conda environment for all steps of the flow.
-
- Use `@conda_base` to set common libraries required by all
- steps and use `@conda` to specify step-specific additions.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default {}
- Packages to use for this flow. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables Conda.
- """
- ...
-
- @typing.overload
- def conda_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
- ...
-
- def conda_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
- """
- Specifies the Conda environment for all steps of the flow.
-
- Use `@conda_base` to set common libraries required by all
- steps and use `@conda` to specify step-specific additions.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default {}
- Packages to use for this flow. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables Conda.
- """
- ...
-
- @typing.overload
- def pypi_base(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
- """
- Specifies the PyPI packages for all steps of the flow.
-
- Use `@pypi_base` to set common packages required by all
- steps and use `@pypi` to specify step-specific overrides.
-
- Parameters
- ----------
- packages : Dict[str, str], default: {}
- Packages to use for this flow. The key is the name of the package
- and the value is the version to use.
- python : str, optional, default: None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- """
- ...
-
- @typing.overload
- def pypi_base(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
- ...
-
- def pypi_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
- """
- Specifies the PyPI packages for all steps of the flow.
-
- Use `@pypi_base` to set common packages required by all
- steps and use `@pypi` to specify step-specific overrides.
-
- Parameters
- ----------
- packages : Dict[str, str], default: {}
- Packages to use for this flow. The key is the name of the package
- and the value is the version to use.
- python : str, optional, default: None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- """
- ...
-
 def with_artifact_store(f: typing.Optional[typing.Type[FlowSpecDerived]] = None):
 """
 Allows setting external datastores to save data for the
@@ -1776,149 +1920,5 @@ def with_artifact_store(f: typing.Optional[typing.Type[FlowSpecDerived]] = None)
 """
 ...

- @typing.overload
- def trigger_on_finish(*, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
- """
- Specifies the flow(s) that this flow depends on.
-
- ```
- @trigger_on_finish(flow='FooFlow')
- ```
- or
- ```
- @trigger_on_finish(flows=['FooFlow', 'BarFlow'])
- ```
- This decorator respects the @project decorator and triggers the flow
- when upstream runs within the same namespace complete successfully
-
- Additionally, you can specify project aware upstream flow dependencies
- by specifying the fully qualified project_flow_name.
- ```
- @trigger_on_finish(flow='my_project.branch.my_branch.FooFlow')
- ```
- or
- ```
- @trigger_on_finish(flows=['my_project.branch.my_branch.FooFlow', 'BarFlow'])
- ```
-
- You can also specify just the project or project branch (other values will be
- inferred from the current project or project branch):
- ```
- @trigger_on_finish(flow={"name": "FooFlow", "project": "my_project", "project_branch": "branch"})
- ```
-
- Note that `branch` is typically one of:
- - `prod`
- - `user.bob`
- - `test.my_experiment`
- - `prod.staging`
-
-
- Parameters
- ----------
- flow : Union[str, Dict[str, str]], optional, default None
- Upstream flow dependency for this flow.
- flows : List[Union[str, Dict[str, str]]], default []
- Upstream flow dependencies for this flow.
- options : Dict[str, Any], default {}
- Backend-specific configuration for tuning eventing behavior.
- """
- ...
-
- @typing.overload
- def trigger_on_finish(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
- ...
-
- def trigger_on_finish(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}):
- """
- Specifies the flow(s) that this flow depends on.
-
- ```
- @trigger_on_finish(flow='FooFlow')
- ```
- or
- ```
- @trigger_on_finish(flows=['FooFlow', 'BarFlow'])
- ```
- This decorator respects the @project decorator and triggers the flow
- when upstream runs within the same namespace complete successfully
-
- Additionally, you can specify project aware upstream flow dependencies
- by specifying the fully qualified project_flow_name.
- ```
- @trigger_on_finish(flow='my_project.branch.my_branch.FooFlow')
- ```
- or
- ```
- @trigger_on_finish(flows=['my_project.branch.my_branch.FooFlow', 'BarFlow'])
- ```
-
- You can also specify just the project or project branch (other values will be
- inferred from the current project or project branch):
- ```
- @trigger_on_finish(flow={"name": "FooFlow", "project": "my_project", "project_branch": "branch"})
- ```
-
- Note that `branch` is typically one of:
- - `prod`
- - `user.bob`
- - `test.my_experiment`
- - `prod.staging`
-
-
- Parameters
- ----------
- flow : Union[str, Dict[str, str]], optional, default None
- Upstream flow dependency for this flow.
- flows : List[Union[str, Dict[str, str]]], default []
- Upstream flow dependencies for this flow.
- options : Dict[str, Any], default {}
- Backend-specific configuration for tuning eventing behavior.
- """
- ...
-
- def airflow_s3_key_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, bucket_key: typing.Union[str, typing.List[str]], bucket_name: str, wildcard_match: bool, aws_conn_id: str, verify: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
- """
- The `@airflow_s3_key_sensor` decorator attaches a Airflow [S3KeySensor](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/_api/airflow/providers/amazon/aws/sensors/s3/index.html#airflow.providers.amazon.aws.sensors.s3.S3KeySensor)
- before the start step of the flow. This decorator only works when a flow is scheduled on Airflow
- and is compiled using `airflow create`. More than one `@airflow_s3_key_sensor` can be
- added as a flow decorators. Adding more than one decorator will ensure that `start` step
- starts only after all sensors finish.
-
-
- Parameters
- ----------
- timeout : int
- Time, in seconds before the task times out and fails. (Default: 3600)
- poke_interval : int
- Time in seconds that the job should wait in between each try. (Default: 60)
- mode : str
- How the sensor operates. Options are: { poke | reschedule }. (Default: "poke")
- exponential_backoff : bool
- allow progressive longer waits between pokes by using exponential backoff algorithm. (Default: True)
- pool : str
- the slot pool this task should run in,
- slot pools are a way to limit concurrency for certain tasks. (Default:None)
- soft_fail : bool
- Set to true to mark the task as SKIPPED on failure. (Default: False)
- name : str
- Name of the sensor on Airflow
- description : str
- Description of sensor in the Airflow UI
- bucket_key : Union[str, List[str]]
- The key(s) being waited on. Supports full s3:// style url or relative path from root level.
- When it's specified as a full s3:// url, please leave `bucket_name` as None
- bucket_name : str
- Name of the S3 bucket. Only needed when bucket_key is not provided as a full s3:// url.
- When specified, all the keys passed to bucket_key refers to this bucket. (Default:None)
- wildcard_match : bool
- whether the bucket_key should be interpreted as a Unix wildcard pattern. (Default: False)
- aws_conn_id : str
- a reference to the s3 connection on Airflow. (Default: None)
- verify : bool
- Whether or not to verify SSL certificates for S3 connection. (Default: None)
- """
- ...
-
 pkg_name: str