zurückpropagiert
Zurückpropagiert, German for "backpropagated" and commonly rendered in English as backpropagation, is a fundamental algorithm in the training of artificial neural networks. It is the standard method for adjusting a network's weights in response to the error it makes during training. The process begins after the network has produced a prediction for a given input: the difference between this prediction and the correct output, known as the error, is calculated.
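As a minimal sketch of this first step, the following Python code runs an input through a tiny two-layer network and measures the error with a squared-error loss. The network size, the sigmoid activation, and names such as forward and squared_error are illustrative choices, not part of any particular library.

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic activation."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """Forward pass through a two-layer network; returns all intermediates."""
    z1 = W1 @ x + b1        # pre-activation of the hidden layer
    a1 = sigmoid(z1)        # hidden-layer activation
    z2 = W2 @ a1 + b2       # pre-activation of the output layer
    y_hat = sigmoid(z2)     # network prediction
    return z1, a1, z2, y_hat

def squared_error(y_hat, y):
    """Error between the prediction and the correct output."""
    return 0.5 * np.sum((y_hat - y) ** 2)

# Example: one input, one target, randomly initialized weights.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)
x, y = np.array([0.5, -1.2]), np.array([1.0])
_, _, _, y_hat = forward(x, W1, b1, W2, b2)
print(squared_error(y_hat, y))
```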
This error is then propagated backward through the network, layer by layer. For each neuron, the algorithm applies the chain rule of calculus to compute the gradient of the error with respect to that neuron's weights and bias, i.e., how much each parameter contributed to the error.
Once these gradients are computed, the network's weights and biases are adjusted in the opposite direction of their gradients, typically scaled by a small learning rate. Repeating this gradient-descent update over many training examples gradually reduces the error.
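Continuing the earlier sketch, the snippet below shows how the error signal could be pushed back through the two layers with the chain rule and how each parameter is then moved a small step against its gradient. The helper names (backward, sgd_step) and the learning rate are assumptions for illustration; the intermediates z1, a1, z2, y_hat are those returned by the forward function above.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    """Derivative of the sigmoid, needed by the chain rule."""
    s = sigmoid(z)
    return s * (1.0 - s)

def backward(x, y, z1, a1, z2, y_hat, W2):
    """Gradients of the squared error w.r.t. all weights and biases."""
    delta2 = (y_hat - y) * sigmoid_prime(z2)        # error signal at the output layer
    delta1 = (W2.T @ delta2) * sigmoid_prime(z1)    # error propagated back to the hidden layer
    grad_W2 = np.outer(delta2, a1)
    grad_b2 = delta2
    grad_W1 = np.outer(delta1, x)
    grad_b1 = delta1
    return grad_W1, grad_b1, grad_W2, grad_b2

def sgd_step(params, grads, lr=0.1):
    """Move each parameter a small step opposite its gradient (gradient descent)."""
    return [p - lr * g for p, g in zip(params, grads)]
```

In practice, the forward pass, backward pass, and update are repeated for many inputs (often in mini-batches) until the error stops improving.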