HPO and binary classification are live ⚡️
by Ellery Berk
Myst Platform Release (2022-07-07)
Hello Myst Platform users,
We are excited to release Hyperparameter Optimization (HPO) this week!
Please update your `myst-alpha` package at your earliest convenience to ensure you have access to this week’s updates (instructions here).
✨ New feature:
- HPO is live! HPO automatically searches for the hyperparameter values that give your model the best performance. Please see our documentation explaining HPO concepts and how to create and run an HPO to learn more. HPO is currently limited to XGBoost regression models; we will add more advanced HPO functionality in the coming releases. We welcome your feedback on this MVP release and your input on how to prioritize further HPO feature development.
⚡️ Enhancements:
- XGBoost and LightGBM models now support binary classification. To create an XGBoost classification model, select the `binary:logistic` objective function in the UI or pass the parameter `objective=XGBoostObjective.BINARY_LOGISTIC` in the client library when creating the model connector. To create a LightGBM classification model, select the `binary` objective function in the UI or pass the parameter `objective=LightGBMObjective.BINARY` in the client library when creating the model connector. Make sure your target is a binary variable. These connectors return the probability of the positive class, and backtests will report the classification metrics log loss, precision, and recall. A minimal client-library sketch is included below.
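For reference, here is a minimal sketch of both classification cases in the client library. The `objective=XGBoostObjective.BINARY_LOGISTIC` and `objective=LightGBMObjective.BINARY` parameters come from this note; the import paths and connector constructors shown are assumptions, so please check the client library reference for the exact module layout and signatures.

```python
# Sketch only: the objective parameters below are from this release note, but the
# import path and constructor shapes are assumptions; consult the client library
# documentation for the exact calls.
from myst_alpha import (  # assumed import path
    XGBoost,
    XGBoostObjective,
    LightGBM,
    LightGBMObjective,
)

# XGBoost binary classification: the connector returns probabilities of the positive class.
xgboost_classifier = XGBoost(
    objective=XGBoostObjective.BINARY_LOGISTIC,
)

# LightGBM binary classification, using the binary objective.
lightgbm_classifier = LightGBM(
    objective=LightGBMObjective.BINARY,
)
```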
Thank you!
Charlie and the Myst team