Hyperparameters are settings chosen before training a machine learning model to control its learning process and improve its performance. Examples include the number of layers in a neural network, the number of neurons in each layer, and the learning rate. Hyperparameters are usually determined through hyperparameter tuning, a process in which different combinations are tested to find the one that yields the best model performance.
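As a minimal sketch of that tuning process, the toy example below runs a grid search: the "model" is plain gradient descent minimizing f(x) = (x - 3)^2, the hyperparameters are the learning rate and the number of steps, and every combination is tried to find the one with the lowest final loss. All names here (`train`, `candidate_lrs`, `candidate_steps`) are illustrative, not from any library.

```python
def train(lr, steps):
    """Run gradient descent on f(x) = (x - 3)**2 and return the final loss."""
    x = 0.0
    for _ in range(steps):
        grad = 2 * (x - 3)  # derivative of (x - 3)**2
        x -= lr * grad
    return (x - 3) ** 2

# Candidate hyperparameter values to search over (arbitrary choices).
candidate_lrs = [0.001, 0.1, 0.5, 1.1]
candidate_steps = [10, 50]

# Grid search: try every combination, keep the one with the lowest loss.
best = min(
    ((lr, steps) for lr in candidate_lrs for steps in candidate_steps),
    key=lambda combo: train(*combo),
)
print("best (lr, steps):", best)
```

Real projects would typically use a tuning library (such as scikit-learn's `GridSearchCV` or Optuna) and evaluate each combination on a held-out validation set rather than on training loss, but the loop structure is the same: enumerate candidates, score each, keep the best.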
For example, the learning rate is a hyperparameter that determines how fast the algorithm learns from the data. If the learning rate is too high, the algorithm may overshoot the optimal solution and perform poorly on the test data. If it is too low, the algorithm may take too long to learn and likewise perform poorly.
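The learning-rate trade-off can be seen directly on a toy problem: gradient descent on f(x) = (x - 3)^2, where the minimum is at x = 3. With a fixed step budget, a rate that is too high overshoots and diverges, a rate that is too low barely moves, and a moderate rate converges. The function name and the specific rates below are illustrative assumptions.

```python
def final_loss(lr, steps=20):
    """Gradient descent on f(x) = (x - 3)**2; return the loss after `steps` updates."""
    x = 0.0
    for _ in range(steps):
        x -= lr * 2 * (x - 3)  # step against the gradient
    return (x - 3) ** 2

too_high = final_loss(1.05)  # overshoots the minimum on every step and diverges
too_low = final_loss(0.01)   # moves so slowly it is still far from x = 3
balanced = final_loss(0.4)   # converges close to the minimum
print(too_high, too_low, balanced)
```

Running this shows the loss for the too-high rate growing without bound, the too-low rate stuck at a sizable loss, and the balanced rate driving the loss to nearly zero, which is exactly the behavior described above.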