Softmaxzi
Softmaxzi is a probabilistic normalization technique described as an extension of the softmax activation used in machine learning. It modifies the standard softmax by applying z-score normalization to the input vector before computing class probabilities:

    p_i = exp(z_i) / sum_j exp(z_j),   with   z_i = (x_i - mu) / sigma,

where mu is the mean of the components x_i and sigma is their standard deviation, both computed per sample.
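A minimal NumPy sketch of this definition follows; the function name softmaxzi and the small eps guard against a zero standard deviation are illustrative assumptions, not part of the stated definition.

    import numpy as np

    def softmaxzi(x, eps=1e-12):
        # Softmax over a z-score-normalized input vector (per-sample scope).
        # eps guards the degenerate all-equal case, where sigma = 0; this
        # guard is an assumption, the definition above leaves it unspecified.
        mu, sigma = x.mean(), x.std()
        z = (x - mu) / (sigma + eps)   # z-score normalization
        e = np.exp(z - z.max())        # subtract max for numerical safety
        return e / e.sum()             # standard softmax on z

    # The result is unchanged by shifting or positively rescaling the input,
    # unlike standard softmax, which is shift- but not scale-invariant.
    x = np.array([1.0, 2.0, 4.0])
    print(np.allclose(softmaxzi(x), softmaxzi(3.0 * x + 7.0)))  # True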
This pre-normalization makes the output invariant to additive shifts and, unlike standard softmax, also to positive rescaling of the inputs: adding a constant to every x_i shifts mu by the same amount, and scaling the inputs by a positive constant scales both x_i - mu and sigma equally, so z is unchanged in either case. Because softmax already ignores the subtraction of mu, Softmaxzi coincides with softmax at a per-sample temperature T = sigma. This can improve numerical stability and calibration.
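A quick numeric check of the temperature-scaling equivalence just noted (a self-contained sketch; softmax here is the standard definition, and the eps guard is omitted for brevity):

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    # z-scoring subtracts mu (which softmax ignores) and divides by sigma,
    # so Softmaxzi is exactly softmax at per-sample temperature T = sigma.
    x = np.array([0.5, 1.5, 3.0])
    zi = softmax((x - x.mean()) / x.std())
    print(np.allclose(zi, softmax(x / x.std())))  # True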
Variants may compute mu and sigma over different scopes: per sample, per batch, or from running estimates maintained across batches (see the sketch below).
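As an illustration of these scopes, the sketch below contrasts a per-batch variant (one mu and sigma shared across all rows of a batch) with a running-estimate variant in the spirit of batch normalization; the names, the momentum parameter, and the update rule are assumptions for illustration, since the source does not specify them.

    import numpy as np

    def softmaxzi_batch(X, eps=1e-12):
        # Per-batch scope: a single mu/sigma shared by every row of X (n, k).
        mu, sigma = X.mean(), X.std()
        Z = (X - mu) / (sigma + eps)
        E = np.exp(Z - Z.max(axis=1, keepdims=True))
        return E / E.sum(axis=1, keepdims=True)

    class RunningSoftmaxzi:
        # Running-estimate scope: exponential moving averages of mu/sigma,
        # updated during training and frozen at inference, as in batch norm.
        def __init__(self, momentum=0.9, eps=1e-12):
            self.momentum, self.eps = momentum, eps
            self.mu, self.sigma = 0.0, 1.0

        def __call__(self, x, update=True):
            if update:  # training mode: fold current sample stats into the EMA
                m = self.momentum
                self.mu = m * self.mu + (1 - m) * x.mean()
                self.sigma = m * self.sigma + (1 - m) * x.std()
            z = (x - self.mu) / (self.sigma + self.eps)
            e = np.exp(z - z.max())
            return e / e.sum()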
Applications include neural networks for multi-class classification, calibration of predicted probabilities, and scenarios with skewed or poorly scaled input scores.
Softmaxzi is a relatively new or niche concept with limited widespread adoption. Its effectiveness appears to be context-dependent and has not been established by broad empirical evidence.
See also: Softmax, z-score normalization, batch normalization, probability calibration, temperature scaling.