Fehlerrückführung
Fehlerrückführung, often translated as error feedback or error backpropagation, is a fundamental concept in the training of artificial neural networks. It describes the process by which the error or loss incurred by a neural network's prediction is propagated backward through the network's layers. This backward propagation allows the network to adjust its internal parameters, specifically the weights and biases, to minimize future errors.
The process begins with a forward pass, where input data is fed through the network to produce a prediction. A loss function then quantifies the discrepancy between this prediction and the desired target output. In the subsequent backward pass, this error signal is propagated from the output layer back toward the input layer, and the chain rule of calculus is applied at each layer to compute the gradient of the loss with respect to every weight and bias.
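The forward and backward passes can be sketched as follows. This is a minimal illustrative example, not drawn from the article: a network with one sigmoid hidden layer and a linear output, trained on a single input/target pair with a squared-error loss. All variable names are assumptions chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=(3, 1))          # input vector
t = np.array([[1.0]])                # target output

W1 = rng.normal(size=(4, 3)) * 0.5   # hidden-layer weights
b1 = np.zeros((4, 1))                # hidden-layer biases
W2 = rng.normal(size=(1, 4)) * 0.5   # output-layer weights
b2 = np.zeros((1, 1))                # output-layer bias

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: input -> hidden activation -> prediction
z1 = W1 @ x + b1
a1 = sigmoid(z1)
y = W2 @ a1 + b2                     # linear output
loss = 0.5 * float((y - t) ** 2)     # squared-error loss

# Backward pass: propagate the error with the chain rule
dy = y - t                           # dLoss/dy
dW2 = dy @ a1.T                      # dLoss/dW2
db2 = dy                             # dLoss/db2
da1 = W2.T @ dy                      # error sent back to the hidden layer
dz1 = da1 * a1 * (1.0 - a1)          # through the sigmoid derivative
dW1 = dz1 @ x.T                      # dLoss/dW1
db1 = dz1                            # dLoss/db1
```

Each gradient has the same shape as the parameter it corresponds to, which is what allows the subsequent update step to subtract a scaled gradient from every weight and bias.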
Once the gradients are computed, an optimization algorithm, most commonly gradient descent or one of its variants, uses them to update the weights and biases, shifting each parameter a small step in the direction that decreases the loss. Repeating this cycle of forward pass, backward pass, and parameter update over many training examples gradually reduces the network's error.
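The update rule itself can be sketched in isolation. This hypothetical example, not taken from the article, applies plain gradient descent to a one-dimensional quadratic loss L(w) = (w - 3)², whose gradient 2(w - 3) is known in closed form; in a real network the gradient would instead come from backpropagation.

```python
def gradient_descent(w, lr=0.1, steps=100):
    """Repeatedly step the parameter w against the gradient of
    the loss L(w) = (w - 3)**2, which is minimized at w = 3."""
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)   # gradient of the loss at w
        w = w - lr * grad        # gradient-descent update
    return w

w_final = gradient_descent(w=0.0)
```

Starting from w = 0, the iterates converge toward the minimizer w = 3; the learning rate `lr` controls the step size and is the same hyperparameter that governs weight updates in neural-network training.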