The Model Arena
There are many machine learning and statistical forecasting models for a reason. Some models are great at capturing multiple seasonal fluctuations, while others excel at predicting underlying trends. Even within the same business, data patterns can vary wildly by region, product, or service. Why would you want a one-size-fits-all approach to forecasting across a variety of line items (intersections of products, regions, services, plants, and so on)?
That is why SensibleAI Forecast has the Model Arena, a proprietary modeling technique purpose-built to squeeze every ounce of forecast accuracy out of the line items in your business planning processes.
At its core, the Model Arena is a competitive model training environment in which multiple machine learning and statistical models compete to be elected the “champion”, the best performer, for each line item being forecasted. The process is grounded in a rigorous training strategy that evaluates models across multiple time periods, so the winning model is not only accurate but also robust and does not overfit the training data as business conditions change. Because the Model Arena draws on a diverse set of models, SensibleAI Forecast can choose the right model for each line item based on its unique data patterns, making sure no accuracy is left on the table.
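To make the competition concrete, the sketch below shows the general pattern in miniature: a handful of simple candidate models are scored per line item with rolling-origin backtests, and the lowest-error model is crowned champion. This is a hedged illustration only; the candidate models, the MAE metric, and the fold layout are assumptions chosen for demonstration and do not reflect SensibleAI Forecast's proprietary implementation.

```python
# Conceptual sketch only: the actual Model Arena is proprietary.
# Illustrates per-line-item champion selection via rolling-origin backtests.
import numpy as np

def naive_last_value(history, horizon):
    """Repeat the last observed value (a simple baseline)."""
    return np.full(horizon, history[-1])

def moving_average(history, horizon, window=3):
    """Forecast the mean of the most recent `window` observations."""
    return np.full(horizon, history[-window:].mean())

def drift(history, horizon):
    """Extrapolate the average per-period change (a crude trend model)."""
    slope = (history[-1] - history[0]) / max(len(history) - 1, 1)
    return history[-1] + slope * np.arange(1, horizon + 1)

# Assumed candidate set; the real Model Arena uses the models listed below.
CANDIDATES = {"last_value": naive_last_value,
              "moving_average": moving_average,
              "drift": drift}

def backtest_error(series, model, horizon=3, n_folds=3):
    """Average MAE over several rolling-origin folds, so a champion must
    perform well across different time windows, not just one split."""
    errors = []
    for fold in range(n_folds):
        cut = len(series) - horizon * (n_folds - fold)
        train, actual = series[:cut], series[cut:cut + horizon]
        forecast = model(train, horizon)
        errors.append(np.mean(np.abs(forecast - actual)))
    return float(np.mean(errors))

def pick_champion(series):
    """Score every candidate and return the one with the lowest backtest error."""
    scores = {name: backtest_error(series, m) for name, m in CANDIDATES.items()}
    return min(scores, key=scores.get), scores

# Each line item (product, region, plant, ...) gets its own champion.
if __name__ == "__main__":
    line_items = {
        "product_a": np.array([100, 102, 101, 105, 107, 110,
                               112, 115, 118, 120, 123, 125], dtype=float),
        "product_b": np.array([50, 48, 52, 49, 51, 50,
                               53, 47, 52, 50, 49, 51], dtype=float),
    }
    for name, series in line_items.items():
        champion, scores = pick_champion(series)
        print(name, "->", champion, scores)
```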
The table below lists the forecast models available within the Model Arena; a brief, illustrative sketch of fitting a few of these models follows the table.
| Model Name | Category | Description | Interpretability | Supports Multiseries |
|---|---|---|---|---|
| XGBoost | ML | Extreme Gradient Boosting; known for its performance and speed, excellent for handling large datasets and complex patterns. | Feature Impact, Prediction Explanations | |
| CatBoost | ML | Gradient boosting with categorical feature support; excels in handling categorical data and preventing overfitting. | Feature Impact, Prediction Explanations | |
| LGBM | ML | LightGBM, a gradient boosting framework; highly efficient and fast, suitable for large datasets with many features. | Feature Impact, Prediction Explanations | |
| Cubist | ML | Rule-based regression model; effective for complex relationships and handling both numeric and categorical predictors. | Feature Impact, Prediction Explanations | |
| Random Forest | ML | Ensemble of decision trees; robust and effective for capturing non-linear relationships and interactions. | Feature Impact, Prediction Explanations | |
| Elastic Net | ML | Combines L1 and L2 regularization; useful for feature selection and handling multicollinearity in large datasets. | Feature Impact, Prediction Explanations | |
| Poly Elastic Net | ML | Combines polynomial features with Elastic Net; useful for capturing non-linear relationships. | Feature Impact, Prediction Explanations | |
| SVR | ML | Support Vector Regression; handles non-linear relationships well, effective for small to medium-sized datasets. | Feature Impact, Prediction Explanations | |
| Tweedie GLM | ML | Generalized Linear Model with Tweedie distribution; good for modeling data with both continuous and discrete components. | Feature Impact, Prediction Explanations | |
| ARIMA | Statistical | Autoregressive Integrated Moving Average; powerful for series with trends and seasonality when properly tuned. | Feature Impact, Prediction Explanations | |
| Croston | Statistical | Specifically designed for intermittent demand forecasting; good for sparse data with periods of zero demand. | Feature Impact, Prediction Explanations | |
| ETS Simple | Statistical | Exponential Smoothing; best for capturing level and trend components in time series. | Feature Impact, Prediction Explanations | |
| Exponential Smoothing | Statistical | Weighs past observations with exponentially decreasing weights; useful for smoothing and forecasting. | Feature Impact, Prediction Explanations | |
| Fourier | Statistical | Decomposes time series into sinusoidal components; effective for capturing periodic patterns. | Feature Impact, Prediction Explanations | |
| Holt Linear | Statistical | A specific case of Holt’s method with linear trend; best for series with linear trends. | Feature Impact, Prediction Explanations | |
| Holt Winter Additive | Statistical | Adds seasonal components to Holt's method; good for series with additive seasonality. | Feature Impact, Prediction Explanations | |
| Holt Winter Multiplicative | Statistical | Adds seasonal components multiplicatively; best for series with multiplicative seasonality. | Feature Impact, Prediction Explanations | |
| Seasonal Additive | Statistical | Models seasonality additively; suitable for series with additive seasonal patterns. | Feature Impact, Prediction Explanations | |
| Seasonal Multiplicative | Statistical | Models seasonality multiplicatively; effective for series with multiplicative seasonal patterns. | Feature Impact, Prediction Explanations | |
| Simple Exponential Smoothing | Statistical | Single-parameter smoothing; best for level data without trend or seasonality. | Feature Impact, Prediction Explanations | |
| Simple Moving Average | Statistical | Averages a fixed number of past observations; useful for smoothing out short-term fluctuations. | Feature Impact, Prediction Explanations | |
| Theta | Statistical | Combines decomposed components of time series; known for its simplicity and effectiveness. | Feature Impact, Prediction Explanations | |
| Mean | Baseline | Averages all past values to forecast future values; best for stationary series without trend or seasonality. | Feature Impact, Prediction Explanations | |
| Shift | Baseline | Uses lagged values as predictors; good for capturing autocorrelation in the data. | Feature Impact, Prediction Explanations | |
| Last Value Naïve | Baseline | Uses the last observed value as the forecast; simple and effective for random walks or very short-term forecasts. | Feature Impact, Prediction Explanations | |
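As a rough illustration of a few of the statistical models listed above, the sketch below fits Simple Exponential Smoothing, Holt Linear, and Holt-Winters Additive to a synthetic monthly series using the open-source statsmodels library. The library choice, the synthetic data, and the 12-month horizon are assumptions made for demonstration; they are not how SensibleAI Forecast implements these models internally.

```python
# Hedged illustration: fits three of the statistical model families listed above
# with statsmodels. The data below is synthetic and for demonstration only.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import (SimpleExpSmoothing, Holt,
                                         ExponentialSmoothing)

# Monthly toy series with an upward trend and a 12-month seasonal cycle.
idx = pd.date_range("2021-01-01", periods=36, freq="MS")
y = pd.Series(100 + 2 * np.arange(36)
              + 10 * np.sin(2 * np.pi * np.arange(36) / 12), index=idx)

# Simple Exponential Smoothing: level only, no trend or seasonality.
ses_forecast = SimpleExpSmoothing(y).fit().forecast(12)

# Holt Linear: level plus a linear trend component.
holt_forecast = Holt(y).fit().forecast(12)

# Holt-Winters Additive: level, trend, and an additive seasonal component.
hw_forecast = ExponentialSmoothing(y, trend="add", seasonal="add",
                                   seasonal_periods=12).fit().forecast(12)

print(hw_forecast.head())
```

In the Model Arena, each of these families would simply be another contender scored per line item; a series with strong additive seasonality would tend to favor the Holt-Winters variant, while a flat, noisy series would tend to favor the simpler smoothers.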