normaalisaatio
Normaalisaatio, in English normalization, is a general term for processes that adjust data, signals, or models to a common scale or standard. The goal is to enable meaningful comparisons, stabilize numerical computations, or satisfy model assumptions across diverse applications. Normaalisaatio is used across statistics, machine learning, signal processing, imaging, and probability theory, and its exact form depends on the context.
In data preprocessing and machine learning, common methods include:
- Min‑max normalization, which rescales features to a fixed range, typically [0, 1] or [-1, 1].
- Z‑score standardization, which centers data at zero and scales by the standard deviation, yielding features with zero mean and unit variance.
- L2 (unit-length) normalization of vectors, often used in text and directional data.
- Robust scaling, which uses quantiles to mitigate the influence of outliers.
- Transformations such as logarithmic or Box‑Cox to make distributions more normal-like before modeling.
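The scaling methods above can be sketched with the Python standard library (a minimal illustration; the function names and sample data are ours, not a standard API):

```python
import math
from statistics import median, quantiles

x = [1.0, 2.0, 2.0, 3.0, 10.0]  # 10.0 acts as an outlier

def min_max(values):
    # Rescale to the range [0, 1]
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def z_score(values):
    # Center at zero, scale by the (population) standard deviation
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / std for v in values]

def l2_normalize(values):
    # Scale the vector to unit Euclidean length
    norm = math.sqrt(sum(v * v for v in values))
    return [v / norm for v in values]

def robust_scale(values):
    # Center on the median, scale by the interquartile range,
    # so outliers have limited influence
    q1, _, q3 = quantiles(values, n=4)
    return [(v - median(values)) / (q3 - q1) for v in values]

log_transformed = [math.log1p(v) for v in x]  # compresses a right-skewed range
```

Note how the outlier 10.0 compresses the other values toward 0 under min‑max scaling, while robust scaling keeps them spread out; this is the practical reason quantile-based scaling is preferred for heavy-tailed data.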
In probability and statistics, normalization refers to adjusting a function so that its total mass or integral equals one, turning it into a valid probability distribution; the scale factor required to do so is called the normalizing constant.
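For a discrete distribution this amounts to dividing each unnormalized weight by their sum, as in this small sketch (the weights are illustrative):

```python
weights = [2.0, 3.0, 5.0]           # unnormalized mass
Z = sum(weights)                     # normalizing constant
probs = [w / Z for w in weights]     # now sums to 1: a valid distribution
print(probs)                         # → [0.2, 0.3, 0.5]
```

The continuous case is analogous, with the sum replaced by an integral over the function's domain.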
Applications of normaalisaatio include improving numerical stability, accelerating convergence of optimization algorithms, and enhancing comparability between datasets, features, and models.