logits
Logits are the raw, unnormalized scores produced by a classifier before a normalization step such as softmax.

In binary classification, a single logit z corresponds to the log-odds of the positive class: p = sigmoid(z) = 1/(1+exp(-z)); equivalently, z = log(p/(1-p)).

In multi-class classification with K classes, the model outputs a vector z in R^K of logits, and the predicted class probabilities are obtained by applying the softmax function: p_i = exp(z_i) / sum_j exp(z_j). The logits therefore encode relative evidence for each class: only their relative values matter for the final decision, while probabilities are obtained after normalization.
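Both conversions are easy to see in code. Here is a minimal NumPy sketch (the helper names sigmoid and softmax and the example values are chosen for illustration, not taken from any particular library):

```python
import numpy as np

def sigmoid(z):
    # Map a single logit (log-odds) to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Map a logit vector to a probability distribution.
    # Subtracting the max logit first avoids overflow in exp().
    shifted = z - np.max(z)
    exp_z = np.exp(shifted)
    return exp_z / exp_z.sum()

# Binary case: one logit encodes the log-odds of the positive class.
z = 1.5
p = sigmoid(z)                             # ~0.8176
assert np.isclose(np.log(p / (1 - p)), z)  # recover the logit from p

# Multi-class case: a vector of K logits.
logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)                    # ~[0.659, 0.242, 0.099]
assert np.isclose(probs.sum(), 1.0)
```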
From a training perspective, using logits directly with cross-entropy loss is common: the loss computes the log-softmax of the logits internally and returns the negative log-probability of the true class. Working in log space this way, via the log-sum-exp trick, is more numerically stable than first applying softmax and then taking the logarithm of the result.
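Concretely, the loss can be computed from logits without ever forming explicit probabilities. A sketch for a single example follows (the function name cross_entropy_from_logits is hypothetical; combined library losses such as PyTorch's nn.CrossEntropyLoss, which also take logits as input, follow the same idea):

```python
import numpy as np

def cross_entropy_from_logits(logits, target):
    # Cross-entropy of one example, computed directly from logits:
    #   log p_t = z_t - log(sum_j exp(z_j))
    # The max-subtraction keeps log-sum-exp stable for large logits.
    shifted = logits - np.max(logits)
    log_sum_exp = np.log(np.sum(np.exp(shifted)))
    log_prob_target = shifted[target] - log_sum_exp
    return -log_prob_target

logits = np.array([2.0, 1.0, 0.1])
loss = cross_entropy_from_logits(logits, target=0)  # ~0.417
```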
Important distinctions include that logits are not probabilities themselves; they are real-valued scores. They can be negative, greater than 1, and unbounded in either direction; they become probabilities only after a normalization such as sigmoid or softmax. Because softmax is invariant to adding the same constant to every logit, the absolute value of an individual logit is not directly interpretable; only the differences between logits carry information about the prediction.
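This shift-invariance can be checked directly (reusing the stable softmax sketch from above):

```python
import numpy as np

def softmax(z):
    # Stable softmax: subtract the max logit before exponentiating.
    shifted = z - np.max(z)
    exp_z = np.exp(shifted)
    return exp_z / exp_z.sum()

# Logits are unbounded real scores: negative and large values are fine.
logits = np.array([-3.2, 0.0, 5.7])

# Adding the same constant to every logit leaves softmax unchanged,
# so only the differences between logits carry information.
assert np.allclose(softmax(logits), softmax(logits + 100.0))
```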