Does Random Forest Need Feature Scaling or Normalization?

If you are using Random Forest as your machine learning model, you don’t need to worry about scaling or normalizing your features. Random Forest is a tree-based model and hence does not require feature scaling. Tree-based models are invariant to the scale of the features, which makes them very user-friendly, as this step can be skipped during preprocessing. Still, in practice you may see slightly different results when you scale your features, because of the way numerical values are represented in computers....
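Not part of the original post, but a minimal sketch of the idea using scikit-learn on synthetic data: fit the same Random Forest on raw and standardized features and compare predictions. Any mismatch would come from floating-point effects, not from the model caring about scale.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic data; the specific dataset is only illustrative.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_scaled = StandardScaler().fit_transform(X)

# Same model, same seed, raw vs. standardized features.
rf_raw = RandomForestClassifier(random_state=0).fit(X, y)
rf_scaled = RandomForestClassifier(random_state=0).fit(X_scaled, y)

same = np.mean(rf_raw.predict(X) == rf_scaled.predict(X_scaled))
print("Fraction of identical predictions:", same)
```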

June 29, 2023 · 5 min · Mario Filho

Do Neural Networks Need Feature Scaling Or Normalization?

In short, feature scaling or normalization is not strictly required for neural networks, but it is highly recommended. Scaling or normalizing the input features can be the difference between a neural network that converges in a few iterations and one that takes hundreds of iterations to converge or even fails to converge at all. The optimization process may become slower because the gradients in the direction of the larger-scale features will be significantly larger than the gradients in the direction of the smaller-scale features....
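As a rough illustration (not from the original post), here is a sketch with scikit-learn's MLPClassifier on synthetic data where one feature is put on a much larger scale; the iteration counts will vary with the data, but the standardized pipeline typically converges in fewer iterations.

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X[:, 0] *= 1000.0  # exaggerate one feature's scale

# Same network with and without standardizing the inputs.
mlp_raw = MLPClassifier(max_iter=200, random_state=0).fit(X, y)
mlp_scaled = make_pipeline(
    StandardScaler(), MLPClassifier(max_iter=200, random_state=0)
).fit(X, y)

print("iterations without scaling:", mlp_raw.n_iter_)
print("iterations with scaling:   ",
      mlp_scaled.named_steps["mlpclassifier"].n_iter_)
```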

April 4, 2023 · 8 min · Mario Filho

Does SVM Need Feature Scaling Or Normalization?

In Support Vector Machines (SVM), feature scaling or normalization is not strictly required, but it is highly recommended, as it can significantly improve model performance and convergence speed. SVM tries to find the optimal hyperplane that separates the data points of different classes with the maximum margin. If the features are on different scales, the hyperplane will be heavily influenced by the features with larger values, potentially leading to suboptimal results....
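A quick sketch (not from the original post) of that effect, using scikit-learn's SVC on synthetic data with one feature blown up to a much larger scale; the exact scores depend on the data, but the unscaled fit is usually the weaker one.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X[:, 0] *= 1000.0  # one feature on a much larger scale

# Same RBF-kernel SVM with and without standardization.
raw = cross_val_score(SVC(), X, y, cv=5).mean()
scaled = cross_val_score(make_pipeline(StandardScaler(), SVC()), X, y, cv=5).mean()
print(f"accuracy without scaling: {raw:.3f}")
print(f"accuracy with scaling:    {scaled:.3f}")
```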

April 1, 2023 · 5 min · Mario Filho

Do Decision Trees Need Feature Scaling Or Normalization?

In general, no. Decision trees are not sensitive to feature scaling because their splits don’t change under any monotonic transformation of the features. Normalization is not necessary either, but it can change your results because it’s not monotonic, as we’ll see later. That said, the numerical implementation of a specific library may make your decision tree predictions change if you don’t scale or normalize your data. This is usually a very small change that you don’t need to worry about, but it’s good to know about if you find yourself in a situation where you need to explain why your predictions are different....
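To illustrate the invariance to monotonic transformations (a sketch, not the original post's code): a tree fit on min-max scaled features, which is a monotonic per-feature transform, should make essentially the same predictions as a tree fit on the raw features.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.preprocessing import MinMaxScaler
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_mono = MinMaxScaler().fit_transform(X)  # monotonic per-feature transform

# Same tree, raw vs. monotonically transformed features: the split
# thresholds move with the transform, so predictions should match.
tree_raw = DecisionTreeClassifier(random_state=0).fit(X, y)
tree_mono = DecisionTreeClassifier(random_state=0).fit(X_mono, y)

same = np.mean(tree_raw.predict(X) == tree_mono.predict(X_mono))
print("Fraction of identical predictions:", same)
```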

March 24, 2023 · 5 min · Mario Filho

Does XGBoost Need Feature Scaling Or Normalization?

If you are using XGBoost with decision trees as your base model, you don’t need to worry about scaling or normalizing your features. Decision trees are not sensitive to the scale of the features. In practice, I have seen very minor differences in score from scaling features for decision trees, but these are due to numerical computing implementations and are not significant in practice. If you are using XGBoost with linear models as base models, it is a good idea to scale or normalize the features....
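A minimal sketch of the comparison (not from the original post), assuming the xgboost scikit-learn wrapper and synthetic data: the tree booster should score essentially the same with and without scaling, while the linear `gblinear` booster may benefit from it, depending on the data.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from xgboost import XGBClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X[:, 0] *= 1000.0  # one feature on a much larger scale
X_scaled = StandardScaler().fit_transform(X)

# Default tree booster: scaling should make essentially no difference.
tree = XGBClassifier(random_state=0)
print("gbtree, raw:     ", cross_val_score(tree, X, y, cv=5).mean())
print("gbtree, scaled:  ", cross_val_score(tree, X_scaled, y, cv=5).mean())

# Linear booster: behaves like a linear model, so scaling can matter.
linear = XGBClassifier(booster="gblinear", random_state=0)
print("gblinear, raw:   ", cross_val_score(linear, X, y, cv=5).mean())
print("gblinear, scaled:", cross_val_score(linear, X_scaled, y, cv=5).mean())
```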

December 30, 2022 · 7 min · Mario Filho