Macro by Mark
© 2026 Mark Jayson Nation


ETS (exponential smoothing)
Model

Error / trend / seasonality state-space framework that decomposes a series into smoothed components and forecasts forward.

How do you decompose a time series into interpretable level, trend, and seasonal states while simultaneously producing optimal forecasts?

Background

Exponential smoothing dates to the 1950s and 1960s, with foundational contributions from Brown (1959), Holt (1957), and Winters (1960). Brown introduced simple exponential smoothing for inventory demand forecasting for the U.S. Navy; Holt extended it to handle linear trends; Winters added a seasonal component. For decades, these methods existed as ad hoc recursive updating rules without a formal statistical model behind them. The revolution came when Ord, Koehler, and Snyder (1997) and Hyndman, Koehler, Snyder, and Grose (2002) recast exponential smoothing as a family of state-space models with explicit error distributions, creating what is now called the ETS (Error, Trend, Seasonal) framework. This gave exponential smoothing a proper likelihood function, principled model selection via information criteria, and correct prediction intervals.

The core mechanism is recursive state updating. At each time step, the model observes the new data point, computes a one-step-ahead prediction error (the 'innovation'), and uses that error to update three state components: the level (the current underlying value of the series), the trend (the current rate of change), and the seasonal component (the current seasonal factor for this period within the cycle). The smoothing parameters alpha, beta, and gamma control how much weight the innovation receives in each update -- high values mean the states react quickly to new information, low values mean they change slowly. The key insight of the ETS framework is that these updating equations are not just rules of thumb; they are the transition equations of a state-space model whose measurement equation links the states to the observed data through either additive or multiplicative composition.
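The recursion is easiest to see in the simplest member of the family, ETS(A,N,N), which keeps only a level. A minimal Python sketch of the innovation-driven update (illustrative only, with made-up data; not a library implementation):

```python
def ses_filter(y, alpha, l0):
    """Simple exponential smoothing, ETS(A,N,N): level-only filtering.

    Each step forms the one-step-ahead forecast (the current level),
    observes the innovation e_t, and moves the level by alpha * e_t.
    """
    level = l0
    forecasts, errors = [], []
    for y_t in y:
        forecasts.append(level)        # forecast made before seeing y_t
        e_t = y_t - level              # one-step prediction error (innovation)
        errors.append(e_t)
        level = level + alpha * e_t    # state update: react by a fraction alpha
    return level, forecasts, errors

# alpha near 1 makes the level jump to the latest observation;
# alpha near 0 leaves it almost untouched.
level_fast, _, _ = ses_filter([10.0, 12.0, 11.0], alpha=1.0, l0=10.0)  # -> 11.0
level_slow, _, _ = ses_filter([10.0, 12.0, 11.0], alpha=0.0, l0=10.0)  # -> 10.0
```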

ETS is one of the two dominant univariate forecasting frameworks in applied statistics, alongside ARIMA/SARIMA. In the M3 forecasting competition (2000, 3,003 series), automatic exponential smoothing methods ranked among the top performers, and in the M4 competition (2018, 100,000 series), ETS remained competitive with machine learning approaches on monthly and quarterly macro data. The forecast package for R and the statsforecast library for Python provide fully automatic ETS model selection via AICc, covering all 30 model variants in the ETS taxonomy. Central banks and statistical agencies that use ARIMA as their primary framework often run ETS in parallel as a robustness check -- the ECB's forecasting platform and the Reserve Bank of Australia's MARTIN model both maintain ETS benchmarks alongside ARIMA baselines.

The ETS taxonomy classifies 30 models by three design choices: the error structure (Additive or Multiplicative), the trend structure (None, Additive, Additive-damped, Multiplicative, Multiplicative-damped), and the seasonal structure (None, Additive, Multiplicative). Each combination yields a distinct state-space model with its own likelihood, forecast function, and prediction interval formula. Of these 30, roughly 15 are commonly used in practice; several multiplicative-trend variants are unstable and excluded from automatic selection. The naming convention uses a three-letter code: ETS(A,A,M) means Additive error, Additive trend, Multiplicative seasonality.
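The 2 x 5 x 3 design can be enumerated directly; a short sketch confirming the count of 30 and the three-letter naming convention:

```python
from itertools import product

errors = ["A", "M"]                    # Additive, Multiplicative error
trends = ["N", "A", "Ad", "M", "Md"]   # None, Additive, Additive-damped,
                                       # Multiplicative, Multiplicative-damped
seasons = ["N", "A", "M"]              # None, Additive, Multiplicative

taxonomy = [f"ETS({e},{t},{s})" for e, t, s in product(errors, trends, seasons)]

assert len(taxonomy) == 30             # 2 * 5 * 3 combinations
assert "ETS(A,A,M)" in taxonomy        # Additive error, additive trend,
                                       # multiplicative seasonality
```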

How the Parts Fit Together

The input is a single time series observed at a regular frequency. If the series has a seasonal component, the seasonal period s must be declared (not estimated). The series enters the model through a measurement equation that links the observed value y_t to three unobserved state variables: the level l_t, the trend b_t, and the seasonal factor s_t. The measurement equation can combine these states additively (y_t = l_{t-1} + b_{t-1} + s_{t-s} + epsilon_t for ETS(A,A,A)) or multiplicatively (y_t = (l_{t-1} + b_{t-1}) * s_{t-s} * (1 + epsilon_t) for ETS(M,A,M)), and the choice between additive and multiplicative composition is a modeling decision that affects both point forecasts and prediction intervals.

The state transition equations update each component after each observation. For additive error models, the level update is l_t = l_{t-1} + b_{t-1} + alpha * e_t, where e_t is the one-step prediction error and alpha is the level smoothing parameter. The trend update is b_t = b_{t-1} + beta * e_t. The seasonal update is s_t = s_{t-s} + gamma * e_t. Each smoothing parameter controls the balance between responsiveness and stability: alpha near 1 makes the level jump to the latest observation; alpha near 0 makes it barely move. The damped trend variant adds a damping parameter phi that multiplicatively shrinks the trend each period: b_t = phi * b_{t-1} + beta * e_t, causing the trend to attenuate over the forecast horizon.
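Written out in code, one filtering step of the additive-error model with an optional damped trend (a sketch of the transition equations above, not library code; phi = 1 recovers the undamped ETS(A,A,A)):

```python
def ets_step(y_t, level, trend, season, alpha, beta, gamma, phi=1.0):
    """One step of ETS(A,Ad,A) filtering.

    `season` is the seasonal state from s periods ago, s_{t-s}.
    Returns the innovation and the updated (level, trend, seasonal) states.
    """
    y_hat = level + phi * trend + season      # one-step-ahead forecast
    e_t = y_t - y_hat                         # innovation
    new_level = level + phi * trend + alpha * e_t
    new_trend = phi * trend + beta * e_t      # phi < 1 shrinks the trend
    new_season = season + gamma * e_t
    return e_t, new_level, new_trend, new_season
```

Running this over a series and carrying the last s seasonal states in a deque is all a full ETS(A,Ad,A) filter requires.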

Estimation maximizes the likelihood implied by the state-space structure. For additive error models, the innovations are Gaussian with constant variance; for multiplicative error models, the innovations are Gaussian with variance proportional to the squared conditional mean. The likelihood is computed from the one-step-ahead prediction error decomposition, identical in structure to the Kalman filter approach used for ARIMA. The initial states (l_0, b_0, and s_{1-s} through s_0) are estimated jointly with the smoothing parameters, either by concentrated MLE or by optimization over the full parameter vector. Model selection across the 30 ETS variants uses AICc, with inadmissible models (those that can produce negative forecasts for inherently positive series) excluded from the candidate set.
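For an additive-error model, the prediction error decomposition reduces the likelihood to a Gaussian sum over the one-step errors, with sigma^2 concentrated out at its MLE; a sketch of that computation and of the AICc used to compare variants (the error vector is hypothetical):

```python
import math

def concentrated_loglik(errors):
    """Gaussian log-likelihood from the one-step prediction errors,
    with sigma^2 replaced by its MLE, the mean squared error."""
    n = len(errors)
    sigma2 = sum(e * e for e in errors) / n
    return -0.5 * n * (math.log(2 * math.pi * sigma2) + 1)

def aicc(loglik, k, n):
    """AIC with the small-sample correction; k counts smoothing
    parameters, initial states, and sigma^2."""
    return -2 * loglik + 2 * k + 2 * k * (k + 1) / (n - k - 1)

errors = [0.4, -0.1, 0.3, -0.5, 0.2]       # hypothetical one-step errors
ll = concentrated_loglik(errors)
score = aicc(ll, k=3, n=len(errors))       # lower is better across variants
```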

Applications

The primary use case is large-scale automatic forecasting where thousands or millions of series must be forecast with minimal human intervention. Amazon, Walmart, and other large retailers use ETS-class models in their demand forecasting pipelines to generate SKU-level predictions that drive inventory, pricing, and staffing decisions. The M5 accuracy competition (2020), which used Walmart point-of-sale data, confirmed that exponential smoothing remains competitive with gradient-boosted tree methods on high-frequency retail data. The appeal is speed and reliability: automatic ETS selects from 30 model variants via AICc, fits in milliseconds per series, and produces both point forecasts and calibrated prediction intervals without any manual tuning.

A secondary application is as a benchmark and robustness check in institutional forecasting. The Reserve Bank of Australia maintains ETS benchmarks for key macro aggregates alongside their MARTIN structural model and ARIMA baselines. The European Central Bank's broad macroeconomic projection exercise includes ETS as one of several univariate cross-checks on the structural model's forecasts. The logic is adversarial: if ETS -- a simple method with no economic theory -- outperforms the structural model, the structural model is adding noise rather than signal.

ETS reaches its limits when the series has complex autocorrelation structure within each seasonal cycle. Because ETS updates its states with a single exponential weight, it cannot capture the kind of within-season serial dependence (e.g., significant autocorrelation at lags 1 through 3 within each quarter) that ARIMA's richer polynomial structure handles naturally. ETS also struggles with multiple seasonal periods: a daily series with both weekly (s=7) and annual (s=365) patterns cannot be modeled by standard ETS. TBATS extends ETS to handle this, but at the cost of interpretability.

The damped trend variant, ETS(A,Ad,N), deserves special mention. Introduced by Gardner and McKenzie (1985), it has proven to be one of the most reliable all-purpose forecasting methods across decades of forecast competitions. The damping parameter phi prevents the trend from extrapolating linearly into the indefinite future -- instead, the forecast flattens toward a horizontal asymptote. For macro series where persistent trends are rare and mean reversion is the norm, the damped trend consistently outperforms both the undamped trend and the no-trend alternatives.

Literature and Extensions

Key Papers

  • Holt (1957) -- Forecasting seasonals and trends by exponentially weighted moving averages (ONR Research Memorandum 52). Introduced trend-corrected exponential smoothing.
  • Winters (1960) -- Forecasting sales by exponentially weighted moving averages (Management Science). Added the seasonal component to create the Holt-Winters method.
  • Hyndman, Koehler, Snyder, and Grose (2002) -- A state space framework for automatic forecasting using exponential smoothing methods (International Journal of Forecasting). The foundational paper for the ETS taxonomy and automatic selection.
  • Gardner and McKenzie (1985) -- Forecasting trends in time series (Management Science). Introduced the damped trend, which became one of the most successful forecasting modifications ever proposed.
  • Hyndman, Koehler, Ord, and Snyder (2008) -- Forecasting with Exponential Smoothing: The State Space Approach (Springer). The comprehensive monograph covering all 30 ETS models.

Named Variants

  • Simple exponential smoothing (SES) -- ETS(A,N,N): level only, no trend or seasonality
  • Holt's linear method -- ETS(A,A,N): level + additive trend
  • Damped trend -- ETS(A,Ad,N): level + damped additive trend
  • Holt-Winters additive -- ETS(A,A,A): level + trend + additive seasonality
  • Holt-Winters multiplicative -- ETS(M,A,M): multiplicative error and seasonality, additive trend

Open Questions

  • The relationship between ETS and ARIMA is well-characterized for some combinations (IMA(1,1) is equivalent to SES; the airline model is approximately equivalent to additive Holt-Winters, ETS(A,A,A)) but the full mapping across all 30 ETS models and the ARIMA family remains incomplete for multiplicative error models.
  • Whether multiplicative error models are genuinely better than log-transforming the series and applying additive error models is debated. Log-transformation + additive ETS gives approximately the same point forecast but different prediction intervals.
  • The initial state estimation problem is not fully solved. Current implementations optimize over the initial states jointly with smoothing parameters, but the likelihood surface can be multimodal, especially for multiplicative models with many seasonal states.

Components

l_t -- Level state

The underlying de-trended, de-seasonalized value of the series at time t, updated each period by a fraction alpha of the prediction error.

b_t -- Trend state

The current rate of change per period, updated by a fraction beta of the prediction error. Can be additive (linear) or multiplicative (exponential growth rate).

s_t -- Seasonal state

The seasonal factor for the current period within the seasonal cycle, updated by a fraction gamma of the prediction error. Can be additive (absolute offset) or multiplicative (scaling ratio).

alpha -- Level smoothing parameter

Controls how quickly the level state responds to new data. Range [0, 1]. High alpha tracks noise; low alpha produces a smooth level.

beta -- Trend smoothing parameter

Controls how quickly the trend state responds to new data. Range [0, alpha]. Constrained to be no larger than alpha to prevent the trend from being noisier than the level.

gamma -- Seasonal smoothing parameter

Controls how quickly the seasonal state responds to new data. Range [0, 1-alpha]. Constrained to preserve the level-seasonal orthogonality.

phi -- Trend damping parameter

Multiplicatively shrinks the trend each period. Range (0, 1]. phi = 1 is undamped; phi < 1 attenuates the trend exponentially over the forecast horizon.

epsilon_t -- Innovation

White-noise disturbance. For additive error: epsilon_t ~ N(0, sigma^2). For multiplicative error: epsilon_t ~ N(0, sigma^2) with y_t = mu_t(1 + epsilon_t).

Assumptions

Single source of errorMaintained

Each observation is driven by a single innovation epsilon_t that enters the measurement equation and simultaneously updates all state variables. There is no observation noise separate from the state noise.

If violated: If the true DGP has separate observation and state disturbances (a multiple-source-of-error model), ETS misspecifies the error structure. The point forecast is unaffected but prediction intervals are wrong.

Error distributionTestable

Innovations epsilon_t are iid Gaussian with mean zero. For additive error models, Var(epsilon_t) = sigma^2. For multiplicative error models, Var(epsilon_t) = sigma^2 (relative errors).

If violated: Non-Gaussian innovations do not affect point forecast consistency but invalidate the analytic prediction interval formulas. Bootstrap prediction intervals can substitute.

Constant smoothing parametersMaintained

The smoothing parameters alpha, beta, gamma, and phi are constant over the estimation window. The responsiveness of each state component does not change over time.

If violated: If the optimal smoothing weights change (e.g., more reactive to shocks during recessions), the model averages across regimes and may over-smooth or under-smooth in specific periods.

Stable seasonal periodTestable

The seasonal period s is fixed and correctly specified. The calendar pattern has exactly s periods per cycle with no drift in the cycle length.

If violated: Misspecified s produces systematic forecast errors at the wrong seasonal phase. For series with multiple seasonal periods, standard ETS handles only one; TBATS or multiple seasonal decomposition is needed.

Non-negative series (for multiplicative models)Testable

For ETS models with multiplicative error or multiplicative seasonality, the series must be strictly positive throughout the estimation and forecast windows.

If violated: Multiplicative models can produce negative forecasts when applied to series that cross zero. The automatic ETS selector excludes multiplicative variants when the series contains zeros or negative values.

No exogenous regressorsMaintained

ETS is a pure univariate model. The conditional mean and variance depend only on the series' own past, not on external predictors.

If violated: If strong exogenous predictors exist (e.g., leading indicators for GDP), ETS cannot use them. ARIMAX, transfer function models, or regression with ARIMA errors are needed.