neuralparameter
A neuralparameter is an adjustable element of a neural network that directly determines how inputs are transformed into outputs. In common usage it refers to the model's learned parameters, most notably weights and biases, which are iteratively updated during training to minimize a loss function. Depending on the architecture, neuralparameters can also include other learned components, such as embedding vectors or attention projection matrices, that shape how data flows through the network and how it is represented.
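To make the definition concrete, the following minimal NumPy sketch shows a single dense layer whose neuralparameters, a weight matrix W and a bias vector b, fully determine how an input is transformed into an output (the names and shapes are illustrative, not taken from any particular library):

```python
import numpy as np

rng = np.random.default_rng(0)

# The neuralparameters of one dense layer: every entry of W and b is
# adjustable and would be updated during training.
W = rng.normal(size=(3, 4))   # weights connecting 4 inputs to 3 neurons
b = np.zeros(3)               # one bias per output neuron

def dense(x):
    # The input-to-output transformation is fixed entirely by W and b.
    return np.maximum(W @ x + b, 0.0)   # affine map plus ReLU nonlinearity

x = rng.normal(size=4)        # an example input vector
print(dense(x))
```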
Types of neuralparameters include the per-layer weights that connect neurons, per-neuron biases, and parameters in normalization layers, such as the learned scale and shift of batch normalization.
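As a hedged illustration of these types, the short PyTorch sketch below builds a toy model and lists its learned parameters; the layer sizes are arbitrary, and PyTorch's naming ("weight" for the normalization scale, "bias" for its shift) is one library's convention:

```python
import torch.nn as nn

# A toy model containing the three kinds of neuralparameters discussed
# above: per-layer weights, per-neuron biases, and the learned scale and
# shift of a normalization layer.
model = nn.Sequential(
    nn.Linear(4, 8),      # weight (8x4) and bias (8)
    nn.BatchNorm1d(8),    # learned scale ("weight") and shift ("bias")
    nn.ReLU(),
    nn.Linear(8, 2),      # weight (2x8) and bias (2)
)

for name, p in model.named_parameters():
    print(f"{name:12s} shape={tuple(p.shape)} count={p.numel()}")
```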
Training and optimization involve adjusting neuralparameters to reduce error on labeled data. Backpropagation computes gradients of the loss function with respect to each parameter, and an optimizer such as stochastic gradient descent uses those gradients to update the parameters step by step.
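The sketch below (PyTorch again, with random tensors standing in for a labeled dataset) shows one such update step: backpropagation fills in the gradients, and a stochastic gradient descent optimizer applies them to the neuralparameters:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1)                             # weights and bias
opt = torch.optim.SGD(model.parameters(), lr=0.1)   # lr is a hyperparameter

x = torch.randn(16, 4)      # toy batch of inputs (illustrative data)
y = torch.randn(16, 1)      # toy labels

loss = nn.functional.mse_loss(model(x), y)  # error on the labeled batch
loss.backward()    # backpropagation: gradient of the loss per parameter
opt.step()         # nudge each neuralparameter against its gradient
opt.zero_grad()    # clear gradients before the next step
```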
Neuralparameters are distinct from hyperparameters, which are set before training (for example, learning rate, batch size, or the number of layers) and are chosen rather than learned from the data.
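To underline the distinction, a hypothetical sketch: the hyperparameters below are plain values fixed before training starts, while the neuralparameters are tensors inside the model that the optimizer updates:

```python
import torch
import torch.nn as nn

# Hyperparameters: fixed choices made up front, never learned.
hyperparams = {"learning_rate": 0.01, "batch_size": 32}

model = nn.Linear(4, 1)     # the neuralparameters live here
opt = torch.optim.SGD(model.parameters(),
                      lr=hyperparams["learning_rate"])

n_learned = sum(p.numel() for p in model.parameters())
print(f"learned neuralparameters: {n_learned}")   # 4 weights + 1 bias = 5
# Nothing in `hyperparams` appears in model.parameters(); those values
# shape the training procedure but are not adjusted by it.
```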