preactivations
Preactivations are the input values passed to an activation function within a neural network. When a neuron computes the weighted sum of its inputs plus a bias, the result of that calculation, before the activation function is applied, is the preactivation. This value is crucial because it directly determines the output of the activation function and, consequently, the neuron's contribution to the network's overall computation.
Different activation functions, such as ReLU, sigmoid, or tanh, operate on these preactivation values. For instance, ReLU passes positive preactivations through unchanged and maps negative ones to zero, while sigmoid squashes any preactivation into the range (0, 1).
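The relationship can be sketched for a single neuron; the weights, bias, and inputs below are arbitrary illustrative values, not taken from any particular network:

```python
import math

# Toy inputs, weights, and bias for a single neuron (illustrative values).
x = [0.5, -1.0, 2.0]
w = [0.4, 0.3, -0.2]
b = 0.1

# Preactivation: the weighted sum of inputs plus bias,
# computed before any nonlinearity is applied.
z = sum(wi * xi for wi, xi in zip(w, x)) + b  # ≈ -0.4

# Different activation functions map the same preactivation
# to different neuron outputs.
relu_out = max(0.0, z)
sigmoid_out = 1.0 / (1.0 + math.exp(-z))
tanh_out = math.tanh(z)
```

Because the preactivation here is negative, ReLU zeroes it out, while sigmoid and tanh still produce small nonzero outputs, which illustrates how the choice of activation shapes what the neuron passes forward.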