autogluon.timeseries 1.4.1b20251115__py3-none-any.whl → 1.4.1b20251218__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Files changed (82)
  1. autogluon/timeseries/configs/hyperparameter_presets.py +7 -21
  2. autogluon/timeseries/configs/predictor_presets.py +23 -39
  3. autogluon/timeseries/dataset/ts_dataframe.py +32 -34
  4. autogluon/timeseries/learner.py +67 -33
  5. autogluon/timeseries/metrics/__init__.py +4 -4
  6. autogluon/timeseries/metrics/abstract.py +8 -8
  7. autogluon/timeseries/metrics/point.py +9 -9
  8. autogluon/timeseries/metrics/quantile.py +4 -4
  9. autogluon/timeseries/models/__init__.py +2 -1
  10. autogluon/timeseries/models/abstract/abstract_timeseries_model.py +52 -39
  11. autogluon/timeseries/models/abstract/model_trial.py +2 -1
  12. autogluon/timeseries/models/abstract/tunable.py +8 -8
  13. autogluon/timeseries/models/autogluon_tabular/mlforecast.py +30 -26
  14. autogluon/timeseries/models/autogluon_tabular/per_step.py +12 -10
  15. autogluon/timeseries/models/autogluon_tabular/transforms.py +2 -2
  16. autogluon/timeseries/models/chronos/__init__.py +2 -1
  17. autogluon/timeseries/models/chronos/chronos2.py +395 -0
  18. autogluon/timeseries/models/chronos/model.py +29 -24
  19. autogluon/timeseries/models/chronos/utils.py +5 -5
  20. autogluon/timeseries/models/ensemble/__init__.py +17 -10
  21. autogluon/timeseries/models/ensemble/abstract.py +13 -9
  22. autogluon/timeseries/models/ensemble/array_based/__init__.py +2 -2
  23. autogluon/timeseries/models/ensemble/array_based/abstract.py +24 -31
  24. autogluon/timeseries/models/ensemble/array_based/models.py +146 -11
  25. autogluon/timeseries/models/ensemble/array_based/regressor/__init__.py +2 -0
  26. autogluon/timeseries/models/ensemble/array_based/regressor/abstract.py +6 -5
  27. autogluon/timeseries/models/ensemble/array_based/regressor/linear_stacker.py +186 -0
  28. autogluon/timeseries/models/ensemble/array_based/regressor/per_quantile_tabular.py +44 -83
  29. autogluon/timeseries/models/ensemble/array_based/regressor/tabular.py +21 -55
  30. autogluon/timeseries/models/ensemble/ensemble_selection.py +167 -0
  31. autogluon/timeseries/models/ensemble/per_item_greedy.py +172 -0
  32. autogluon/timeseries/models/ensemble/weighted/abstract.py +7 -3
  33. autogluon/timeseries/models/ensemble/weighted/basic.py +26 -13
  34. autogluon/timeseries/models/ensemble/weighted/greedy.py +20 -145
  35. autogluon/timeseries/models/gluonts/abstract.py +30 -29
  36. autogluon/timeseries/models/gluonts/dataset.py +9 -9
  37. autogluon/timeseries/models/gluonts/models.py +0 -7
  38. autogluon/timeseries/models/local/__init__.py +0 -7
  39. autogluon/timeseries/models/local/abstract_local_model.py +13 -16
  40. autogluon/timeseries/models/local/naive.py +2 -2
  41. autogluon/timeseries/models/local/npts.py +7 -1
  42. autogluon/timeseries/models/local/statsforecast.py +12 -12
  43. autogluon/timeseries/models/multi_window/multi_window_model.py +38 -23
  44. autogluon/timeseries/models/registry.py +3 -4
  45. autogluon/timeseries/models/toto/_internal/backbone/attention.py +3 -4
  46. autogluon/timeseries/models/toto/_internal/backbone/backbone.py +6 -6
  47. autogluon/timeseries/models/toto/_internal/backbone/rope.py +4 -9
  48. autogluon/timeseries/models/toto/_internal/backbone/rotary_embedding_torch.py +342 -0
  49. autogluon/timeseries/models/toto/_internal/backbone/scaler.py +2 -3
  50. autogluon/timeseries/models/toto/_internal/backbone/transformer.py +10 -10
  51. autogluon/timeseries/models/toto/_internal/dataset.py +2 -2
  52. autogluon/timeseries/models/toto/_internal/forecaster.py +8 -8
  53. autogluon/timeseries/models/toto/dataloader.py +4 -4
  54. autogluon/timeseries/models/toto/hf_pretrained_model.py +97 -16
  55. autogluon/timeseries/models/toto/model.py +30 -17
  56. autogluon/timeseries/predictor.py +517 -129
  57. autogluon/timeseries/regressor.py +18 -23
  58. autogluon/timeseries/splitter.py +2 -2
  59. autogluon/timeseries/trainer/ensemble_composer.py +323 -129
  60. autogluon/timeseries/trainer/model_set_builder.py +9 -9
  61. autogluon/timeseries/trainer/prediction_cache.py +16 -16
  62. autogluon/timeseries/trainer/trainer.py +235 -144
  63. autogluon/timeseries/trainer/utils.py +3 -4
  64. autogluon/timeseries/transforms/covariate_scaler.py +7 -7
  65. autogluon/timeseries/transforms/target_scaler.py +8 -8
  66. autogluon/timeseries/utils/constants.py +10 -0
  67. autogluon/timeseries/utils/datetime/lags.py +1 -3
  68. autogluon/timeseries/utils/datetime/seasonality.py +1 -3
  69. autogluon/timeseries/utils/features.py +22 -9
  70. autogluon/timeseries/utils/forecast.py +1 -2
  71. autogluon/timeseries/utils/timer.py +173 -0
  72. autogluon/timeseries/version.py +1 -1
  73. {autogluon_timeseries-1.4.1b20251115.dist-info → autogluon_timeseries-1.4.1b20251218.dist-info}/METADATA +23 -21
  74. autogluon_timeseries-1.4.1b20251218.dist-info/RECORD +103 -0
  75. autogluon_timeseries-1.4.1b20251115.dist-info/RECORD +0 -96
  76. /autogluon.timeseries-1.4.1b20251115-py3.9-nspkg.pth → /autogluon.timeseries-1.4.1b20251218-py3.11-nspkg.pth +0 -0
  77. {autogluon_timeseries-1.4.1b20251115.dist-info → autogluon_timeseries-1.4.1b20251218.dist-info}/WHEEL +0 -0
  78. {autogluon_timeseries-1.4.1b20251115.dist-info → autogluon_timeseries-1.4.1b20251218.dist-info}/licenses/LICENSE +0 -0
  79. {autogluon_timeseries-1.4.1b20251115.dist-info → autogluon_timeseries-1.4.1b20251218.dist-info}/licenses/NOTICE +0 -0
  80. {autogluon_timeseries-1.4.1b20251115.dist-info → autogluon_timeseries-1.4.1b20251218.dist-info}/namespace_packages.txt +0 -0
  81. {autogluon_timeseries-1.4.1b20251115.dist-info → autogluon_timeseries-1.4.1b20251218.dist-info}/top_level.txt +0 -0
  82. {autogluon_timeseries-1.4.1b20251115.dist-info → autogluon_timeseries-1.4.1b20251218.dist-info}/zip-safe +0 -0

autogluon/timeseries/trainer/utils.py
@@ -1,13 +1,12 @@
 import logging
-from typing import Optional
 
 logger = logging.getLogger("autogluon.timeseries.trainer")
 
 
 def log_scores_and_times(
-    val_score: Optional[float],
-    fit_time: Optional[float],
-    predict_time: Optional[float],
+    val_score: float | None,
+    fit_time: float | None,
+    predict_time: float | None,
     eval_metric_name: str,
 ):
     if val_score is not None:
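
The same typing cleanup repeats across most of the files below: `Optional[X]` and `Union[X, None]` annotations are rewritten as PEP 604 unions (`X | None`), which require Python >= 3.10 and match the METADATA change further down that drops Python 3.9 support. A minimal illustration of the before/after spelling (not code from the package):

    from typing import Optional, Union

    # old spelling, removed throughout this release
    val_score_old: Optional[float] = None
    freq_old: Union[str, None] = None

    # new PEP 604 spelling used in 1.4.1b20251218 (requires Python >= 3.10)
    val_score_new: float | None = None
    freq_new: str | None = None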

autogluon/timeseries/transforms/covariate_scaler.py
@@ -1,5 +1,5 @@
 import logging
-from typing import Literal, Optional, Protocol, overload, runtime_checkable
+from typing import Literal, Protocol, overload, runtime_checkable
 
 import numpy as np
 import pandas as pd
@@ -25,8 +25,8 @@ class CovariateScaler(Protocol):
     def transform(self, data: TimeSeriesDataFrame) -> TimeSeriesDataFrame: ...
 
     def transform_known_covariates(
-        self, known_covariates: Optional[TimeSeriesDataFrame] = None
-    ) -> Optional[TimeSeriesDataFrame]: ...
+        self, known_covariates: TimeSeriesDataFrame | None = None
+    ) -> TimeSeriesDataFrame | None: ...
 
 
 class GlobalCovariateScaler(CovariateScaler):
@@ -53,7 +53,7 @@ class GlobalCovariateScaler(CovariateScaler):
         self.use_past_covariates = use_past_covariates
         self.use_static_features = use_static_features
         self.skew_threshold = skew_threshold
-        self._column_transformers: Optional[dict[Literal["known", "past", "static"], ColumnTransformer]] = None
+        self._column_transformers: dict[Literal["known", "past", "static"], ColumnTransformer] | None = None
 
     def is_fit(self) -> bool:
         return self._column_transformers is not None
@@ -105,8 +105,8 @@ class GlobalCovariateScaler(CovariateScaler):
         return data
 
     def transform_known_covariates(
-        self, known_covariates: Optional[TimeSeriesDataFrame] = None
-    ) -> Optional[TimeSeriesDataFrame]:
+        self, known_covariates: TimeSeriesDataFrame | None = None
+    ) -> TimeSeriesDataFrame | None:
         assert self._column_transformers is not None, "CovariateScaler must be fit before transform can be called"
 
         if "known" in self._column_transformers:
@@ -154,7 +154,7 @@ AVAILABLE_COVARIATE_SCALERS = {
 def get_covariate_scaler(name: None, **scaler_kwargs) -> None: ...
 @overload
 def get_covariate_scaler(name: Literal["global"], **scaler_kwargs) -> GlobalCovariateScaler: ...
-def get_covariate_scaler(name: Optional[Literal["global"]] = None, **scaler_kwargs) -> Optional[CovariateScaler]:
+def get_covariate_scaler(name: Literal["global"] | None = None, **scaler_kwargs) -> CovariateScaler | None:
     if name is None:
         return None
     if name not in AVAILABLE_COVARIATE_SCALERS:

autogluon/timeseries/transforms/target_scaler.py
@@ -1,4 +1,4 @@
-from typing import Literal, Optional, Protocol, Union, overload
+from typing import Literal, Protocol, overload
 
 import numpy as np
 import pandas as pd
@@ -27,10 +27,10 @@ class LocalTargetScaler(TargetScaler):
     ):
         self.target = target
         self.min_scale = min_scale
-        self.loc: Optional[pd.Series] = None
-        self.scale: Optional[pd.Series] = None
+        self.loc: pd.Series | None = None
+        self.scale: pd.Series | None = None
 
-    def _compute_loc_scale(self, target_series: pd.Series) -> tuple[Optional[pd.Series], Optional[pd.Series]]:
+    def _compute_loc_scale(self, target_series: pd.Series) -> tuple[pd.Series | None, pd.Series | None]:
         raise NotImplementedError
 
     def fit_transform(self, data: TimeSeriesDataFrame) -> TimeSeriesDataFrame:
@@ -45,7 +45,7 @@ class LocalTargetScaler(TargetScaler):
         self.scale = self.scale.clip(lower=self.min_scale).replace([np.inf, -np.inf], np.nan).fillna(1.0)
         return self
 
-    def _reindex_loc_scale(self, item_index: pd.Index) -> tuple[Union[np.ndarray, float], Union[np.ndarray, float]]:
+    def _reindex_loc_scale(self, item_index: pd.Index) -> tuple[np.ndarray | float, np.ndarray | float]:
         """Reindex loc and scale parameters for the given item_ids and convert them to an array-like."""
         if self.loc is not None:
             loc = self.loc.reindex(item_index).to_numpy()
@@ -82,7 +82,7 @@ class LocalStandardScaler(LocalTargetScaler):
 class LocalMeanAbsScaler(LocalTargetScaler):
     """Applies mean absolute scaling to each time series in the dataset."""
 
-    def _compute_loc_scale(self, target_series: pd.Series) -> tuple[Optional[pd.Series], pd.Series]:
+    def _compute_loc_scale(self, target_series: pd.Series) -> tuple[pd.Series | None, pd.Series]:
         scale = target_series.abs().groupby(level=TimeSeriesDataFrame.ITEMID, sort=False).agg("mean")
         return None, scale
 
@@ -139,8 +139,8 @@ def get_target_scaler(name: None, **scaler_kwargs) -> None: ...
 @overload
 def get_target_scaler(name: Literal["standard", "mean_abs", "min_max", "robust"], **scaler_kwargs) -> TargetScaler: ...
 def get_target_scaler(
-    name: Optional[Literal["standard", "mean_abs", "min_max", "robust"]], **scaler_kwargs
-) -> Optional[TargetScaler]:
+    name: Literal["standard", "mean_abs", "min_max", "robust"] | None, **scaler_kwargs
+) -> TargetScaler | None:
     """Get LocalTargetScaler object from a string."""
     if name is None:
         return None

autogluon/timeseries/utils/constants.py (new file)
@@ -0,0 +1,10 @@
+import joblib.externals.loky
+from joblib import cpu_count
+
+# By default, joblib w/ loky backend kills processes that take >300MB of RAM assuming that this is caused by a memory
+# leak. This leads to problems for some memory-hungry models like AutoARIMA/Theta.
+# This monkey patch removes this undesired behavior
+joblib.externals.loky.process_executor._MAX_MEMORY_LEAK_SIZE = int(3e10)
+
+# We use the same default n_jobs across AG-TS to ensure that Joblib reuses the process pool
+AG_DEFAULT_N_JOBS = max(cpu_count(only_physical_cores=True), 1)
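
A minimal sketch (not AutoGluon code) of why a shared `n_jobs` default matters: joblib's loky backend keeps a reusable worker pool, but only reuses it when subsequent `Parallel` calls request the same number of workers, so routing every call through one constant like `AG_DEFAULT_N_JOBS` avoids respawning processes:

    from joblib import Parallel, cpu_count, delayed

    AG_DEFAULT_N_JOBS = max(cpu_count(only_physical_cores=True), 1)  # mirrors the new constant

    def fit_one(item_id: int) -> int:
        return item_id * 2  # stand-in for fitting one local model

    # Both calls request the same n_jobs, so the second call reuses the warm loky pool.
    first = Parallel(n_jobs=AG_DEFAULT_N_JOBS)(delayed(fit_one)(i) for i in range(8))
    second = Parallel(n_jobs=AG_DEFAULT_N_JOBS)(delayed(fit_one)(i) for i in range(8))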

autogluon/timeseries/utils/datetime/lags.py
@@ -2,8 +2,6 @@
 Generate lag indices based on frequency string. Adapted from gluonts.time_feature.lag.
 """
 
-from typing import Optional
-
 import numpy as np
 import pandas as pd
 
@@ -70,7 +68,7 @@ def _make_lags_for_semi_month(multiple, num_cycles=3):
 def get_lags_for_frequency(
     freq: str,
     lag_ub: int = 1200,
-    num_lags: Optional[int] = None,
+    num_lags: int | None = None,
     num_default_lags: int = 7,
 ) -> list[int]:
     """

autogluon/timeseries/utils/datetime/seasonality.py
@@ -1,5 +1,3 @@
-from typing import Union
-
 import pandas as pd
 
 from .base import norm_freq_str
@@ -22,7 +20,7 @@ DEFAULT_SEASONALITIES = {
 }
 
 
-def get_seasonality(freq: Union[str, None]) -> int:
+def get_seasonality(freq: str | None) -> int:
     """Return the seasonality of a given frequency. Adapted from ``gluonts.time_feature.seasonality``."""
     if freq is None:
         return 1

autogluon/timeseries/utils/features.py
@@ -2,7 +2,7 @@ import logging
 import reprlib
 import time
 from dataclasses import asdict, dataclass, field
-from typing import Any, Literal, Optional
+from typing import Any, Literal
 
 import numpy as np
 import pandas as pd
@@ -30,6 +30,14 @@ class CovariateMetadata:
     known_covariates_cat: list[str] = field(default_factory=list)
     past_covariates_real: list[str] = field(default_factory=list)
     past_covariates_cat: list[str] = field(default_factory=list)
+    static_cat_cardinality: dict[str, int] = field(default_factory=dict)
+    known_cat_cardinality: dict[str, int] = field(default_factory=dict)
+    past_cat_cardinality: dict[str, int] = field(default_factory=dict)
+
+    def __post_init__(self):
+        assert list(self.static_cat_cardinality.keys()) == self.static_features_cat
+        assert list(self.known_cat_cardinality.keys()) == self.known_covariates_cat
+        assert list(self.past_cat_cardinality.keys()) == self.past_covariates_cat
 
     @property
     def static_features(self) -> list[str]:
@@ -136,7 +144,7 @@ class TimeSeriesFeatureGenerator:
         target: str,
         known_covariates_names: list[str],
         float_dtype: str = "float32",
-        num_samples: Optional[int] = 20_000,
+        num_samples: int | None = 20_000,
     ):
         self.target = target
         self.float_dtype = float_dtype
@@ -149,9 +157,9 @@ class TimeSeriesFeatureGenerator:
         self.past_covariates_pipeline = ContinuousAndCategoricalFeatureGenerator()
         # Cat features with cat_count=1 are fine in static_features since they are repeated for all time steps in a TS
         self.static_feature_pipeline = ContinuousAndCategoricalFeatureGenerator(minimum_cat_count=1)
-        self._covariate_metadata: Optional[CovariateMetadata] = None  # type ignore
-        self._train_covariates_real_median: Optional[pd.Series] = None
-        self._train_static_real_median: Optional[pd.Series] = None
+        self._covariate_metadata: CovariateMetadata | None = None  # type ignore
+        self._train_covariates_real_median: pd.Series | None = None
+        self._train_static_real_median: pd.Series | None = None
 
     @property
     def required_column_names(self) -> list[str]:
@@ -221,11 +229,13 @@ class TimeSeriesFeatureGenerator:
             static_features_cat, static_features_real = self._detect_and_log_column_types(static_features_df)
             ignored_static_features = data.static_features.columns.difference(self.static_feature_pipeline.features_in)
             self._train_static_real_median = data.static_features[static_features_real].median()
+            static_cat_cardinality = static_features_df[static_features_cat].nunique().to_dict()
         else:
             static_features_cat = []
             static_features_real = []
            ignored_static_features = []
             static_features_df = None
+            static_cat_cardinality = {}
 
         if len(ignored_covariates) > 0 or len(ignored_static_features) > 0:
             logger.info("\nAutoGluon will ignore following non-numeric/non-informative columns:")
@@ -246,6 +256,9 @@ class TimeSeriesFeatureGenerator:
             past_covariates_real=past_covariates_real,
             static_features_cat=static_features_cat,
             static_features_real=static_features_real,
+            static_cat_cardinality=static_cat_cardinality,
+            known_cat_cardinality=df[known_covariates_cat].nunique().to_dict(),
+            past_cat_cardinality=df[past_covariates_cat].nunique().to_dict(),
         )
 
         # Median of real-valued covariates will be used for missing value imputation
@@ -279,7 +292,7 @@ class TimeSeriesFeatureGenerator:
         ts_df[column_names] = covariates_real
         return ts_df
 
-    def _impute_static_features(self, static_df: Optional[pd.DataFrame]) -> Optional[pd.DataFrame]:
+    def _impute_static_features(self, static_df: pd.DataFrame | None) -> pd.DataFrame | None:
         """Impute missing values in static features using the median."""
         static_real_names = self.covariate_metadata.static_features_real
         if static_df is not None and static_real_names:
@@ -328,8 +341,8 @@ class TimeSeriesFeatureGenerator:
         return self._impute_covariates(ts_df, column_names=self.covariate_metadata.covariates_real)
 
     def transform_future_known_covariates(
-        self, known_covariates: Optional[TimeSeriesDataFrame]
-    ) -> Optional[TimeSeriesDataFrame]:
+        self, known_covariates: TimeSeriesDataFrame | None
+    ) -> TimeSeriesDataFrame | None:
         assert self._is_fit, f"{self.__class__.__name__} has not been fit yet"
         if len(self.known_covariates_names) > 0:
             assert known_covariates is not None, "known_covariates must be provided at prediction time"
@@ -441,7 +454,7 @@ class PermutationFeatureImportanceTransform(AbstractFeatureImportanceTransform):
         self,
         covariate_metadata: CovariateMetadata,
         prediction_length: int,
-        random_seed: Optional[int] = None,
+        random_seed: int | None = None,
         shuffle_type: Literal["itemwise", "naive"] = "itemwise",
         **kwargs,
     ):
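
The new `*_cat_cardinality` fields are filled with `nunique().to_dict()` over the categorical covariate columns, and `__post_init__` asserts that the dictionary keys match the corresponding `*_cat` column lists. A small pandas sketch of that invariant (column names are hypothetical):

    import pandas as pd

    known_covariates_cat = ["promo_type", "weekday"]  # hypothetical categorical columns
    df = pd.DataFrame({"promo_type": ["A", "B", "A", "C"], "weekday": [1, 2, 3, 1]})

    known_cat_cardinality = df[known_covariates_cat].nunique().to_dict()
    # {'promo_type': 3, 'weekday': 3} -- the keys line up with known_covariates_cat,
    # which is exactly what CovariateMetadata.__post_init__ checks.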

autogluon/timeseries/utils/forecast.py
@@ -1,5 +1,4 @@
 import warnings
-from typing import Optional
 
 import numpy as np
 import pandas as pd
@@ -29,7 +28,7 @@ def get_forecast_horizon_index_ts_dataframe(*args, **kwargs) -> pd.MultiIndex:
 def make_future_data_frame(
     ts_dataframe: TimeSeriesDataFrame,
     prediction_length: int,
-    freq: Optional[str] = None,
+    freq: str | None = None,
 ) -> pd.DataFrame:
     """For each item in the dataframe, get timestamps for the next `prediction_length` time steps into the future.
 

autogluon/timeseries/utils/timer.py (new file)
@@ -0,0 +1,173 @@
+import time
+
+from typing_extensions import Self
+
+
+class Timer:
+    """A timer class that tracks a start time, and computes the time elapsed and
+    time remaining, used for handling ``time_limit`` parameters in AutoGluon.
+
+    Parameters
+    ----------
+    time_limit
+        The time limit to set. If None, then ``time_remaining`` will return None, and
+        ``timed_out`` will return False.
+
+    Examples
+    --------
+    Basic usage with time limit:
+
+    >>> timer = Timer(time_limit=10.0).start()
+    >>> # Do some work...
+    >>> if timer.timed_out():
+    ...     print("Time limit exceeded!")
+    >>> print(f"Time remaining: {timer.time_remaining():.2f}s")
+
+    Using as a stopwatch (no time limit):
+
+    >>> timer = Timer(time_limit=None).start()
+    >>> # Do some work...
+    >>> print(f"Elapsed time: {timer.time_elapsed():.2f}s")
+
+    Checking time in a loop:
+
+    >>> timer = Timer(time_limit=5.0).start()
+    >>> for i in range(100):
+    ...     if timer.timed_out():
+    ...         break
+    ...     # Do work for iteration i
+    """
+
+    def __init__(
+        self,
+        time_limit: float | None,
+    ):
+        self.time_limit = time_limit
+
+        self.start_time = None
+
+    def start(self) -> Self:
+        """Start or reset the timer."""
+        self.start_time = time.monotonic()
+        return self
+
+    def time_elapsed(self) -> float:
+        """Total time elapsed since the timer was started. This method can also be used
+        when ``time_limit`` is set to None to count time forward (i.e., as opposed to
+        a countdown timer which other methods imply).
+        """
+        if self.start_time is None:
+            raise RuntimeError("Timer has not been started")
+        return time.monotonic() - self.start_time
+
+    def time_remaining(self) -> float | None:
+        """Total time remaining on the timer. If ``time_limit`` is None,
+        this method also returns None.
+        """
+        if self.start_time is None:
+            raise RuntimeError("Timer has not been started")
+        if self.time_limit is None:
+            return None
+        return self.time_limit - (time.monotonic() - self.start_time)
+
+    def timed_out(self) -> bool:
+        """Whether the timer has timed out. If ``time_limit`` is None, this method
+        always returns False.
+        """
+        if self.start_time is None:
+            raise RuntimeError("Timer has not been started")
+        if self.time_limit is None:
+            return False
+        return self.time_elapsed() >= self.time_limit
+
+
+class SplitTimer(Timer):
+    """A timer that splits remaining time across multiple rounds.
+
+    Extends Timer to divide the total time limit across a specified number of rounds,
+    useful for allocating time budgets to sequential operations. At each call of
+    ``next_round``, the timer re-distributes the remaining time evenly among
+    the remaining rounds.
+
+    Parameters
+    ----------
+    time_limit
+        Total time limit to split across all rounds. If None, ``round_time_remaining``
+        returns None.
+    rounds
+        Number of rounds to split the time across. Default is 1.
+
+    Examples
+    --------
+    Split time across 3 rounds:
+
+    >>> timer = SplitTimer(time_limit=10.0, rounds=3).start()
+    >>> time_round_1 = timer.round_time_remaining()  # Returns ~3.33
+    >>> # Do work for round 1
+    >>> timer.next_round()
+    >>> time_round_2 = timer.round_time_remaining()  # Returns remaining time divided by 2
+    >>> # Do work for round 2
+    >>> timer.next_round()
+    >>> time_round_3 = timer.round_time_remaining()  # Returns all remaining time
+    """
+
+    def __init__(
+        self,
+        time_limit: float | None,
+        rounds: int = 1,
+    ):
+        super().__init__(time_limit)
+        self.rounds = rounds
+
+        self.round_index: int
+        self.round_start_time: float
+
+    def start(self) -> Self:
+        """Reset and start the timer."""
+        super().start()
+        self.round_index = 0
+        self.round_start_time = time.monotonic()
+        return self
+
+    def round_time_remaining(self) -> float | None:
+        """Get the time budget for the current round.
+
+        Calculates the time allocation by dividing the remaining time equally among
+        the remaining rounds. This means if a previous round used less time than
+        allocated, subsequent rounds get more time, and vice versa.
+
+        Returns time budget for the current round in seconds. Returns None if
+        ``time_limit`` is None. Returns 0.0 if all rounds have been exhausted.
+        """
+        if self.time_limit is None:
+            return None
+        if self.start_time is None:
+            raise RuntimeError("Timer has not been started")
+
+        remaining_rounds = self.rounds - self.round_index
+        if remaining_rounds <= 0:
+            return 0.0
+
+        elapsed_time_at_round_start = self.round_start_time - self.start_time
+        remaining_time_at_round_start = self.time_limit - elapsed_time_at_round_start
+        round_time_budget = remaining_time_at_round_start / remaining_rounds
+
+        return round_time_budget - self.round_time_elapsed()
+
+    def round_time_elapsed(self) -> float:
+        """Total time elapsed since the start of this round."""
+        if self.start_time is None:
+            raise RuntimeError("Timer has not been started")
+        return time.monotonic() - self.round_start_time
+
+    def next_round(self) -> Self:
+        """Advance timer to the next round.
+
+        Increments the round counter, which affects the time allocation returned
+        by subsequent ``round_time_remaining`` calls.
+        """
+        if self.start_time is None:
+            raise RuntimeError("Timer has not been started")
+        self.round_index += 1
+        self.round_start_time = time.monotonic()
+        return self
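
A short sketch (not part of the package) of how the new `SplitTimer` can budget a single `time_limit` across sequential steps, re-balancing leftover time after each round:

    import time

    from autogluon.timeseries.utils.timer import SplitTimer  # module added in this release

    def fit_model(name: str, time_limit: float | None) -> None:
        time.sleep(0.1)  # stand-in for a model fit that respects its time budget

    timer = SplitTimer(time_limit=60.0, rounds=3).start()
    for name in ["model_a", "model_b", "model_c"]:  # hypothetical model names
        fit_model(name, time_limit=timer.round_time_remaining())
        timer.next_round()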

autogluon/timeseries/version.py
@@ -1,4 +1,4 @@
 """This is the autogluon version file."""
 
-__version__ = "1.4.1b20251115"
+__version__ = "1.4.1b20251218"
 __lite__ = False

{autogluon_timeseries-1.4.1b20251115.dist-info → autogluon_timeseries-1.4.1b20251218.dist-info}/METADATA
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: autogluon.timeseries
-Version: 1.4.1b20251115
+Version: 1.4.1b20251218
 Summary: Fast and Accurate ML in 3 Lines of Code
 Home-page: https://github.com/autogluon/autogluon
 Author: AutoGluon Community
@@ -23,15 +23,15 @@ Classifier: Operating System :: Microsoft :: Windows
 Classifier: Operating System :: POSIX
 Classifier: Operating System :: Unix
 Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.9
 Classifier: Programming Language :: Python :: 3.10
 Classifier: Programming Language :: Python :: 3.11
 Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: 3.13
 Classifier: Topic :: Software Development
 Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
 Classifier: Topic :: Scientific/Engineering :: Information Analysis
 Classifier: Topic :: Scientific/Engineering :: Image Recognition
-Requires-Python: >=3.9, <3.13
+Requires-Python: >=3.10, <3.14
 Description-Content-Type: text/markdown
 License-File: LICENSE
 License-File: NOTICE
@@ -39,10 +39,9 @@ Requires-Dist: joblib<1.7,>=1.2
 Requires-Dist: numpy<2.4.0,>=1.25.0
 Requires-Dist: scipy<1.17,>=1.5.4
 Requires-Dist: pandas<2.4.0,>=2.0.0
-Requires-Dist: torch<2.8,>=2.6
-Requires-Dist: lightning<2.8,>=2.5.1
-Requires-Dist: pytorch_lightning
-Requires-Dist: transformers[sentencepiece]<4.50,>=4.38.0
+Requires-Dist: torch<2.10,>=2.6
+Requires-Dist: lightning<2.6,>=2.5.1
+Requires-Dist: transformers[sentencepiece]<4.58,>=4.51.0
 Requires-Dist: accelerate<2.0,>=0.34.0
 Requires-Dist: gluonts<0.17,>=0.15.0
 Requires-Dist: networkx<4,>=3.0
@@ -53,23 +52,23 @@ Requires-Dist: coreforecast<0.0.17,>=0.0.12
 Requires-Dist: fugue>=0.9.0
 Requires-Dist: tqdm<5,>=4.38
 Requires-Dist: orjson~=3.9
-Requires-Dist: chronos-forecasting<3,>=2.0.1
+Requires-Dist: einops<1,>=0.7
+Requires-Dist: chronos-forecasting<2.4,>=2.2.2
+Requires-Dist: peft<0.18,>=0.13.0
 Requires-Dist: tensorboard<3,>=2.9
-Requires-Dist: autogluon.core[raytune]==1.4.1b20251115
-Requires-Dist: autogluon.common==1.4.1b20251115
-Requires-Dist: autogluon.features==1.4.1b20251115
-Requires-Dist: autogluon.tabular[catboost,lightgbm,xgboost]==1.4.1b20251115
+Requires-Dist: autogluon.core==1.4.1b20251218
+Requires-Dist: autogluon.common==1.4.1b20251218
+Requires-Dist: autogluon.features==1.4.1b20251218
+Requires-Dist: autogluon.tabular[catboost,lightgbm,xgboost]==1.4.1b20251218
 Provides-Extra: tests
 Requires-Dist: pytest; extra == "tests"
 Requires-Dist: ruff>=0.0.285; extra == "tests"
 Requires-Dist: flaky<4,>=3.7; extra == "tests"
 Requires-Dist: pytest-timeout<3,>=2.1; extra == "tests"
-Provides-Extra: toto
-Requires-Dist: einops<1,>=0.7; extra == "toto"
-Requires-Dist: rotary-embedding-torch<1,>=0.8; extra == "toto"
+Provides-Extra: ray
+Requires-Dist: autogluon.core[raytune]==1.4.1b20251218; extra == "ray"
 Provides-Extra: all
-Requires-Dist: rotary-embedding-torch<1,>=0.8; extra == "all"
-Requires-Dist: einops<1,>=0.7; extra == "all"
+Requires-Dist: autogluon.core[raytune]==1.4.1b20251218; extra == "all"
 Dynamic: author
 Dynamic: classifier
 Dynamic: description
@@ -92,7 +91,7 @@ Dynamic: summary
 
 [![Latest Release](https://img.shields.io/github/v/release/autogluon/autogluon)](https://github.com/autogluon/autogluon/releases)
 [![Conda Forge](https://img.shields.io/conda/vn/conda-forge/autogluon.svg)](https://anaconda.org/conda-forge/autogluon)
-[![Python Versions](https://img.shields.io/badge/python-3.9%20%7C%203.10%20%7C%203.11%20%7C%203.12-blue)](https://pypi.org/project/autogluon/)
+[![Python Versions](https://img.shields.io/badge/python-3.10%20%7C%203.11%20%7C%203.12%20%7C%203.13-blue)](https://pypi.org/project/autogluon/)
 [![Downloads](https://pepy.tech/badge/autogluon/month)](https://pepy.tech/project/autogluon)
 [![GitHub license](https://img.shields.io/badge/License-Apache_2.0-blue.svg)](./LICENSE)
 [![Discord](https://img.shields.io/discord/1043248669505368144?color=7289da&label=Discord&logo=discord&logoColor=ffffff)](https://discord.gg/wjUmjqAc2N)
@@ -109,7 +108,7 @@ AutoGluon, developed by AWS AI, automates machine learning tasks enabling you to
 
 ## 💾 Installation
 
-AutoGluon is supported on Python 3.9 - 3.12 and is available on Linux, MacOS, and Windows.
+AutoGluon is supported on Python 3.10 - 3.13 and is available on Linux, MacOS, and Windows.
 
 You can install AutoGluon with:
 
@@ -132,8 +131,8 @@ predictions = predictor.predict("test.csv")
 | AutoGluon Task | Quickstart | API |
 |:--------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------:|
 | TabularPredictor | [![Quick Start](https://img.shields.io/static/v1?label=&message=tutorial&color=grey)](https://auto.gluon.ai/stable/tutorials/tabular/tabular-quick-start.html) | [![API](https://img.shields.io/badge/api-reference-blue.svg)](https://auto.gluon.ai/stable/api/autogluon.tabular.TabularPredictor.html) |
-| MultiModalPredictor | [![Quick Start](https://img.shields.io/static/v1?label=&message=tutorial&color=grey)](https://auto.gluon.ai/stable/tutorials/multimodal/multimodal_prediction/multimodal-quick-start.html) | [![API](https://img.shields.io/badge/api-reference-blue.svg)](https://auto.gluon.ai/stable/api/autogluon.multimodal.MultiModalPredictor.html) |
 | TimeSeriesPredictor | [![Quick Start](https://img.shields.io/static/v1?label=&message=tutorial&color=grey)](https://auto.gluon.ai/stable/tutorials/timeseries/forecasting-quick-start.html) | [![API](https://img.shields.io/badge/api-reference-blue.svg)](https://auto.gluon.ai/stable/api/autogluon.timeseries.TimeSeriesPredictor.html) |
+| MultiModalPredictor | [![Quick Start](https://img.shields.io/static/v1?label=&message=tutorial&color=grey)](https://auto.gluon.ai/stable/tutorials/multimodal/multimodal_prediction/multimodal-quick-start.html) | [![API](https://img.shields.io/badge/api-reference-blue.svg)](https://auto.gluon.ai/stable/api/autogluon.multimodal.MultiModalPredictor.html) |
 
 ## :mag: Resources
 
@@ -156,7 +155,10 @@ Below is a curated list of recent tutorials and talks on AutoGluon. A comprehens
 - [Benchmarking Multimodal AutoML for Tabular Data with Text Fields](https://datasets-benchmarks-proceedings.neurips.cc/paper/2021/file/9bf31c7ff062936a96d3c8bd1f8f2ff3-Paper-round2.pdf) (*NeurIPS*, 2021) ([BibTeX](CITING.md#autogluonmultimodal))
 - [XTab: Cross-table Pretraining for Tabular Transformers](https://proceedings.mlr.press/v202/zhu23k/zhu23k.pdf) (*ICML*, 2023)
 - [AutoGluon-TimeSeries: AutoML for Probabilistic Time Series Forecasting](https://arxiv.org/abs/2308.05566) (*AutoML Conf*, 2023) ([BibTeX](CITING.md#autogluontimeseries))
-- [TabRepo: A Large Scale Repository of Tabular Model Evaluations and its AutoML Applications](https://arxiv.org/pdf/2311.02971.pdf) (*Under Review*, 2024)
+- [TabRepo: A Large Scale Repository of Tabular Model Evaluations and its AutoML Applications](https://arxiv.org/pdf/2311.02971.pdf) (*AutoML Conf*, 2024)
+- [AutoGluon-Multimodal (AutoMM): Supercharging Multimodal AutoML with Foundation Models](https://arxiv.org/pdf/2404.16233) (*AutoML Conf*, 2024) ([BibTeX](CITING.md#autogluonmultimodal))
+- [Multi-layer Stack Ensembles for Time Series Forecasting](https://arxiv.org/abs/2511.15350) (*AutoML Conf*, 2025) ([BibTeX](CITING.md#autogluontimeseries))
+- [Chronos-2: From Univariate to Universal Forecasting](https://arxiv.org/abs/2510.15821) (*Arxiv*, 2025) ([BibTeX](CITING.md#autogluontimeseries))
 
 ### Articles
 - [AutoGluon-TimeSeries: Every Time Series Forecasting Model In One Library](https://towardsdatascience.com/autogluon-timeseries-every-time-series-forecasting-model-in-one-library-29a3bf6879db) (*Towards Data Science*, Jan 2024)