Standardization vs. Normalization

Zaid Alissa Almaliki
2 min read · Jul 27, 2018
[Image: Feature Scaling, via StackExchange]

Standardization

Standardization (or Z-score normalization) rescales the features so that they have zero mean and unit standard deviation:

μ = 0 and σ = 1

where μ is the mean and σ is the standard deviation. The standard scores (also called z-scores) of the samples are calculated as z = (x − μ) / σ.
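A minimal sketch of that computation in plain Python, using only the standard library (the sample heights below are made up for illustration):

```python
from statistics import mean, pstdev

def standardize(values):
    """Return z-scores: z = (x - mu) / sigma for each sample x."""
    mu = mean(values)        # feature mean
    sigma = pstdev(values)   # population standard deviation
    return [(x - mu) / sigma for x in values]

heights_cm = [150.0, 160.0, 170.0, 180.0, 190.0]
z_scores = standardize(heights_cm)
# The rescaled feature now has mean 0 and standard deviation 1.
```

In practice you would use something like scikit-learn's StandardScaler, which does the same arithmetic column by column.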

Normalization

Normalization, often simply called Min-Max scaling, shrinks the data into a fixed range, typically 0 to 1 (or -1 to 1 if there are negative values). It is useful in cases where standardization might not work well: if the distribution is not Gaussian, or if the standard deviation is very small, the min-max scaler is often the better choice.

Normalization is typically done via the following equation:

X' = (X − X_min) / (X_max − X_min)

(Min-Max Scaling formula via Chris Albon)
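A short sketch of that equation in plain Python (the generalization to an arbitrary target range, and the example prices, are illustrative):

```python
def min_max_scale(values, new_min=0.0, new_max=1.0):
    """Rescale values linearly into [new_min, new_max]."""
    lo, hi = min(values), max(values)
    scale = (new_max - new_min) / (hi - lo)
    return [new_min + (x - lo) * scale for x in values]

prices = [10.0, 15.0, 30.0, 50.0]
scaled = min_max_scale(prices)   # ≈ [0.0, 0.125, 0.5, 1.0]
```

Note that min-max scaling is sensitive to outliers: a single extreme value stretches the range and squashes everything else toward one end.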

Use Cases

Some examples of algorithms where feature scaling is important are:

  • K-nearest neighbors with a Euclidean distance measure, if you want all features to contribute equally.
  • Logistic regression, SVM…
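To see why distance-based algorithms like K-nearest neighbors need scaling, compare a Euclidean distance before and after min-max scaling (the two features, their value ranges, and the samples below are made up for illustration):

```python
from math import dist  # Euclidean distance, Python 3.8+

# Two hypothetical samples: (height in cm, annual salary in USD).
a = (170.0, 50_000.0)
b = (180.0, 52_000.0)

# Unscaled: the salary axis completely dominates the distance;
# the 10 cm height difference barely registers.
d_raw = dist(a, b)

# Min-max scale each feature to [0, 1] using assumed ranges
# (height 150-190 cm, salary 30k-90k USD); now both features
# contribute on comparable terms.
a_scaled = ((170.0 - 150.0) / 40.0, (50_000.0 - 30_000.0) / 60_000.0)
b_scaled = ((180.0 - 150.0) / 40.0, (52_000.0 - 30_000.0) / 60_000.0)
d_scaled = dist(a_scaled, b_scaled)
```

Without scaling, the nearest neighbors are effectively chosen by salary alone; after scaling, both features influence the result.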

Zaid Alissa Almaliki

Founder, Principal Data Engineer, and Cloud Architect Consultant at DataAkkadian.