The NGBoost model wraps the Stanford ML Group's NGBoost library. NGBoost enables predictive uncertainty estimation with gradient boosting by producing probabilistic predictions (including for real-valued outputs). By using natural gradients, NGBoost overcomes the technical challenges that make generic probabilistic prediction difficult with gradient boosting.
Myst's implementation of NGBoost supports four probability distributions: Normal, Cauchy, Laplace, and Lognormal. Each model outputs `scale` and `loc` parameters that can be used to determine the upper and lower bounds of the associated probability distribution. Note that for the Lognormal distribution, the `loc` parameter is returned in log space.
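As a sketch of how the returned `loc` and `scale` parameters can be turned into interval bounds, the snippet below uses `scipy.stats` with illustrative parameter values (the specific numbers are assumptions for demonstration, not output from any real model):

```python
import numpy as np
from scipy.stats import norm, lognorm

# Illustrative parameter values; a fitted model would return these per prediction.
loc, scale = 100.0, 5.0  # Normal distribution: loc is the mean, scale the std dev

# 90% central prediction interval for the Normal distribution
lower, upper = norm.ppf([0.05, 0.95], loc=loc, scale=scale)

# Lognormal distribution: loc is returned in log space (the underlying mu),
# so exponentiate it when mapping to scipy's parameterization.
mu, sigma = np.log(100.0), 0.1
lo_ln, hi_ln = lognorm.ppf([0.05, 0.95], s=sigma, scale=np.exp(mu))
```

For the Lognormal case the interval is asymmetric around the median, which is why the symmetric error metrics below are not reported for it.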
In backtests, MAPE, MSE, and MAE are reported for the symmetric distributions (Normal, Cauchy, Laplace). For the asymmetric distribution (Lognormal), only the mean negative log-likelihood is reported.
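The mean negative log-likelihood scores how much probability the predicted distribution assigns to the observed values. A minimal sketch of computing it for a Normal predictive distribution (the arrays here are made-up illustrative values):

```python
import numpy as np
from scipy.stats import norm

# Observed values and illustrative per-point predicted distribution parameters
y_true = np.array([10.0, 12.0, 11.0])
loc = np.array([10.5, 11.5, 11.0])
scale = np.array([1.0, 1.0, 1.0])

# Mean negative log-likelihood: lower is better
mean_nll = -norm.logpdf(y_true, loc=loc, scale=scale).mean()
```

Unlike MAPE, MSE, and MAE, this metric penalizes both inaccurate location and poorly calibrated uncertainty, so it applies equally well to asymmetric distributions.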
| Parameter description | Default |
| --- | --- |
| The assumed distributional form of the endogenous data, given the exogenous data. | Normal |
| The number of boosting iterations to fit. | 500 |
| The learning rate used at each boosting iteration. | 0.01 |
| The maximum depth of the base learner. | 3 |
| The fraction of rows to subsample in each boosting iteration. | 1 |
| The fraction of columns to subsample in each boosting iteration. | 1 |
See our Create a Price Forecast with Probabilistic Prediction tutorial for examples using NGBoost!