Hyperparameter Optimization Methods
Hyperparameters are configuration settings used to control the training of machine learning models. They are not learned from the data during training; instead, they are set before learning begins and determine how the algorithm searches the parameter space and how quickly it converges.
Examples include the learning rate, batch size, number of layers and units per layer, choice of activation function, regularization strength, and dropout rate.
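The hyperparameters listed above are often collected into a single configuration object before training starts. A minimal sketch (all names and values here are illustrative choices, not prescribed by the text):

```python
# Illustrative hyperparameter configuration for a small neural network.
# Every name and value is an example choice, set before training begins.
config = {
    "learning_rate": 1e-3,
    "batch_size": 32,
    "num_layers": 2,
    "units_per_layer": 128,
    "activation": "relu",
    "l2_regularization": 1e-4,
    "dropout_rate": 0.5,
}

for name, value in config.items():
    print(f"{name} = {value}")
```

Keeping the configuration in one place like this makes it easy to log alongside results and to vary systematically during a search.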
Hyperparameter tuning is the process of selecting values that maximize generalization performance on a validation set.
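One simple tuning method consistent with this description is random search: sample candidate configurations, score each on the validation set, and keep the best. The sketch below uses a synthetic stand-in for the validation score, since training a real model is out of scope; the score function and search ranges are assumptions for illustration.

```python
import math
import random

def validation_score(lr, batch_size):
    # Stand-in for training a model and evaluating it on a validation set;
    # a real run would fit the model and return validation accuracy.
    # This synthetic surface peaks near lr=1e-2 and batch_size=64.
    return -(math.log10(lr) + 2) ** 2 - ((batch_size - 64) / 64) ** 2

def random_search(n_trials, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        lr = 10 ** rng.uniform(-5, -1)          # sample lr on a log scale
        bs = rng.choice([16, 32, 64, 128, 256])  # common batch sizes
        score = validation_score(lr, bs)
        if best is None or score > best[0]:
            best = (score, lr, bs)
    return best

score, lr, bs = random_search(n_trials=50)
print(f"best: lr={lr:.4g}, batch_size={bs}, score={score:.3f}")
```

Sampling the learning rate on a log scale reflects the common practice of searching multiplicative rather than additive ranges for rate-like hyperparameters.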
Good practices include starting from sensible defaults, running small, cheap experiments to narrow the search space, and reporting the search procedure and final configuration so that results are reproducible.