Normalization, also called min-max scaling, refers to the process of rescaling data into a fixed range, usually [0, 1] or sometimes [-1, 1]. The goal of normalization is to ensure that all features are on the same scale, which is particularly important for machine learning algorithms that rely on distance metrics, such as k-Nearest Neighbors (k-NN) and k-means clustering.
There are several types of normalization:
1. Min-Max Normalization (Standard Normalization):
X_norm = (X - X_min) / (X_max - X_min)
- Use Case: This is the most commonly used method for scaling features to a specific range. It's useful when you need to rescale features to a common scale, such as for neural networks or algorithms sensitive to the magnitude of the input data.
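A minimal NumPy sketch of min-max scaling (the helper name and sample data are illustrative; scikit-learn provides the same transform as MinMaxScaler):

```python
import numpy as np

def min_max_normalize(X, feature_range=(0.0, 1.0)):
    """Rescale each column of X to the given range (illustrative helper)."""
    X = np.asarray(X, dtype=float)
    X_min, X_max = X.min(axis=0), X.max(axis=0)
    # Guard against constant columns, where X_max - X_min would be zero.
    span = np.where(X_max > X_min, X_max - X_min, 1.0)
    low, high = feature_range
    return (X - X_min) / span * (high - low) + low

X = np.array([[1.0, 200.0],
              [2.0, 400.0],
              [3.0, 600.0]])
print(min_max_normalize(X))
# [[0.  0. ]
#  [0.5 0.5]
#  [1.  1. ]]
```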
2. Max Abs Normalization:
X_norm = X / max(|X|)
- Use Case: Useful when you want to preserve the sign (positive or negative) of the data while scaling the values into [-1, 1].
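A minimal sketch of the same idea, again with NumPy (the helper name is illustrative; scikit-learn implements this transform as MaxAbsScaler):

```python
import numpy as np

def max_abs_normalize(X):
    """Divide each column by its maximum absolute value, preserving sign."""
    X = np.asarray(X, dtype=float)
    max_abs = np.abs(X).max(axis=0)
    max_abs = np.where(max_abs > 0, max_abs, 1.0)  # avoid dividing by zero
    return X / max_abs

X = np.array([[-4.0,  2.0],
              [ 2.0, -8.0],
              [ 1.0,  4.0]])
print(max_abs_normalize(X))
# [[-1.    0.25]
#  [ 0.5  -1.  ]
#  [ 0.25  0.5 ]]
```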
3. Decimal Scaling:
Decimal scaling normalizes by shifting the decimal point of the feature's values: X_norm = X / 10^j, where j is the smallest integer such that max(|X_norm|) < 1.
- Use Case: Less common, but can be useful when you want to scale the data without fixing it to specific bounds.
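A minimal sketch of decimal scaling, assuming per-column scaling as with the methods above (the helper name is illustrative):

```python
import numpy as np

def decimal_scale(X):
    """Divide each column by 10**j, the smallest power of ten that
    pushes every absolute value in the column below 1."""
    X = np.asarray(X, dtype=float)
    max_abs = np.abs(X).max(axis=0)
    safe = np.where(max_abs > 0, max_abs, 0.1)  # all-zero columns need no scaling
    j = np.maximum(np.floor(np.log10(safe)) + 1, 0)  # never scale values up
    return X / (10.0 ** j)

X = np.array([[125.0, -7.0],
              [480.0,  3.0],
              [-99.0,  9.0]])
print(decimal_scale(X))
# [[ 0.125 -0.7  ]
#  [ 0.48   0.3  ]
#  [-0.099  0.9  ]]
```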