# Hyperparameter Optimization

This page describes the concept of Hyperparameter Optimization in the Myst platform.

## HPOs

Hyperparameter Optimization (HPO) tunes a model's hyperparameters to ensure the model performs at its best. An HPO runs a set of trials, where each trial is a full Backtest, and selects the optimal hyperparameters for a given model based on the results of the collection of trials. You can analyze the results of HPO runs in our Web Application and client library to confirm your models have the best set of hyperparameters.

Aside from the Number of Trials, Max Concurrent Trials, and Search Space, an HPO shares all of its parameters with Backtesting. An HPO in the Myst platform therefore has the following parameters:

Parameter | Description |
---|---|
Title | The title of the HPO |
Model | The Model for which a user wants to create an HPO |
Metric | The metric for determining the optimal hyperparameters. Today, MSE is the only available metric. |
Number of Trials | The number of trials in the HPO |
Max Concurrent Trials | The maximum number of trials running at any one time |
Test Start Time | The start time of each trial's test period |
Test End Time | The end time of each trial's test period |
Fit Start Timing | The start of each Model fit period (Absolute Timing or Relative Timing) |
Fit End Timing | The end of each Model fit period (Absolute Timing or Relative Timing) |
Fit Reference Timing | Schedule or frequency at which Model fits occur (Cron Timing) |
Predict Start Timing | The start of each Model predict period (Absolute Timing or Relative Timing) |
Predict End Timing | The end of each Model predict period (Absolute Timing or Relative Timing) |
Predict Reference Timing | Schedule or frequency at which Model predictions occur (Cron Timing) |
Search Space | The hyperparameters to optimize, along with the search space for each |
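Taken together, these parameters can be grouped into a single configuration object. The sketch below is purely illustrative — it is not the Myst client library API, and every name in it is hypothetical — but it shows how the parameters above relate to one another:

```python
from dataclasses import dataclass, field

@dataclass
class HpoSpec:
    """Illustrative container for the HPO parameters listed above.

    NOT the Myst client library API -- field names simply mirror the table.
    """
    title: str
    model: str                           # identifier of the Model to optimize
    metric: str = "MSE"                  # today, MSE is the only available metric
    num_trials: int = 10                 # number of trials in the HPO
    max_concurrent_trials: int = 1       # max trials running at any one time
    test_start_time: str = ""            # e.g. "2021-03-01T00:00:00Z"
    test_end_time: str = ""              # e.g. "2021-03-14T00:00:00Z"
    fit_start_timing: str = ""           # absolute or relative, e.g. "-P1M"
    fit_end_timing: str = ""             # e.g. "PT0H"
    fit_reference_timing: str = ""       # cron timing, e.g. "0 0 * * 1"
    predict_start_timing: str = ""       # e.g. "PT0H"
    predict_end_timing: str = ""         # e.g. "PT24H"
    predict_reference_timing: str = ""   # cron timing, e.g. "0 0 * * *"
    search_space: dict = field(default_factory=dict)  # parameter -> sampler spec
```

Only Number of Trials, Max Concurrent Trials, and Search Space are HPO-specific; the rest is the same timing configuration a Backtest uses.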

## Example

The tables below include the parameters for an HPO with 10 trials, where each trial runs sequentially. Each trial in the HPO is a full backtest, and the parameters are the same as the example on our Backtesting page. See that page for a detailed description of the backtest and its parameters.

### Parameters

Parameter | Value |
---|---|
Number of Trials | `10` |
Max Concurrent Trials | `1` |
Test Start Time | `2021-03-01T00:00:00Z` |
Test End Time | `2021-03-14T00:00:00Z` |
Fit Start Timing | `-P1M` |
Fit End Timing | `PT0H` |
Fit Reference Timing | `0 0 * * 1` |
Predict Start Timing | `PT0H` |
Predict End Timing | `PT24H` |
Predict Reference Timing | `0 0 * * *` |
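To make the timings concrete, the sketch below expands the cron reference timings into the fit and predict runs each trial would perform over the test period. It assumes the test end time is exclusive and interprets the cron expressions in the usual way (`0 0 * * 1` fires Mondays at 00:00 UTC, `0 0 * * *` fires daily at 00:00 UTC); these assumptions are ours, not a statement of platform behavior:

```python
from datetime import datetime, timedelta

# Test period from the parameters table; 2021-03-01 is a Monday.
test_start = datetime(2021, 3, 1)   # 2021-03-01T00:00:00Z
test_end = datetime(2021, 3, 14)    # 2021-03-14T00:00:00Z (assumed exclusive)

days = [test_start + timedelta(days=d)
        for d in range((test_end - test_start).days)]

# Weekly fit reference times: every Monday at midnight in the test period.
# Each fit covers [reference - P1M, reference + PT0H): the preceding month.
fit_times = [t for t in days if t.weekday() == 0]

# Daily predict reference times: every midnight in the test period.
# Each predict covers [reference + PT0H, reference + PT24H): the next 24 hours.
predict_times = days

print(len(fit_times))      # 2 fits (Mar 1 and Mar 8)
print(len(predict_times))  # 13 daily prediction runs
```

So each trial fits the model twice on a rolling one-month window and produces a day-ahead forecast every day of the two-week test period.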

### Search Space

The parameters below are an example of a search space for an XGBoost model.

Parameter | Sampler | Parameters |
---|---|---|
Boosting Rounds | LogUniform | Lower: `100`, Upper: `1000`, Base: `10` |
Max Depth | QUniform | Lower: `1`, Upper: `12`, q: `1` |
Learning Rate | LogUniform | Lower: `0.005`, Upper: `0.2`, Base: `10` |
Min Child Weight | QUniform | Lower: `0`, Upper: `100`, q: `5` |
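A minimal sketch of what these two samplers typically do, under the common convention that LogUniform draws uniformly in log space (so small learning rates are as likely as large ones) and QUniform draws uniformly and then rounds to the nearest multiple of `q` (how the Myst platform implements them internally may differ):

```python
import math
import random

def log_uniform(lower, upper, base=10, rng=random):
    """Sample uniformly in log space between lower and upper."""
    lo, hi = math.log(lower, base), math.log(upper, base)
    return base ** rng.uniform(lo, hi)

def q_uniform(lower, upper, q, rng=random):
    """Sample uniformly, then quantize to the nearest multiple of q."""
    value = rng.uniform(lower, upper)
    return min(max(round(value / q) * q, lower), upper)

rng = random.Random(0)
print(log_uniform(0.005, 0.2, base=10, rng=rng))  # a Learning Rate draw
print(q_uniform(1, 12, q=1, rng=rng))             # a Max Depth draw
print(q_uniform(0, 100, q=5, rng=rng))            # a Min Child Weight draw
```

LogUniform suits parameters whose effect scales multiplicatively (learning rate, number of boosting rounds), while QUniform suits integer-valued or coarsely stepped parameters (tree depth, minimum child weight).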
