
Normalization without Group Model

Jerwuney Member Posts: 19 Contributor II
Hi, 

I used normalization in my regression analysis and realized the following:

1. If I apply the Group Models operator, the result is almost the same as when I build the model without normalization.

2. If I remove Group Models and use only Normalization, I get better predictions (more of the predicted values are close to the actual values), but the squared correlation becomes zero and the RMSE increases.

I used Decision Tree Regression.

Please, does this mean that Normalization should always be combined with Group Models?

Were the initial results merely memorization?

Thank you.
Jerwuney

Best Answers

  • BalazsBarany Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert Posts: 955 Unicorn
    Solution Accepted
    Hi!

    Normalization only helps with algorithms that compare or combine multiple attributes with different numeric ranges. Example: age (in years) and income (in dollars) have very different numeric ranges, which would make k-NN or Principal Component Analysis heavily influenced by income and hardly at all by age.
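
    A small Python sketch with made-up numbers (an illustration only, not data from the question) shows the effect on a distance calculation:

```python
# Toy example: age in years vs. income in dollars. The Euclidean distance that
# k-NN relies on is dominated by the attribute with the larger numeric range.
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[25, 30_000],
              [26, 90_000],
              [60, 31_000]], dtype=float)  # columns: age, income

# Raw values: person 0 looks far "closer" to the 60-year-old (row 2) than to
# the 26-year-old (row 1), purely because income dwarfs age.
print(np.linalg.norm(X[0] - X[1]))  # ~60000
print(np.linalg.norm(X[0] - X[2]))  # ~1001

# After normalization, both attributes contribute on a comparable scale.
X_scaled = StandardScaler().fit_transform(X)
print(np.linalg.norm(X_scaled[0] - X_scaled[1]))  # ~2.14
print(np.linalg.norm(X_scaled[0] - X_scaled[2]))  # ~2.15
```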

    You don't need normalization with Decision Tree regression: a tree splits on one attribute at a time, so the scale of the values doesn't change the splits. If the squared correlation becomes zero, you did something wrong.

    If you normalize the training data but not the testing data, the model is applied to data with different properties than it was trained on -- this won't improve your models.

    Group Models is helpful in situations where you want to make sure that normalization or another preprocessing step is applied in the same way to both the training and the testing data, and is then built into the production model.
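
    The same idea in scikit-learn terms (a rough Python analogy with synthetic data, not a RapidMiner process): the normalization is learned from the training data, and the identical transformation is reapplied when the combined model scores the test data.

```python
# Sketch of the "grouped" preprocessing idea on synthetic data (illustration
# only): the scaler and the tree are fitted together and applied together.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=300, n_features=5, noise=10.0, random_state=1)
X = X * np.array([1.0, 10.0, 100.0, 1_000.0, 10_000.0])  # very different ranges
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# The pipeline plays the role of the grouped model: the scaler's parameters come
# from the training data and are reused unchanged when scoring the test data.
grouped = make_pipeline(StandardScaler(), DecisionTreeRegressor(random_state=1))
grouped.fit(X_train, y_train)
print("R^2, normalization grouped with the model:", grouped.score(X_test, y_test))

# Normalizing only the training side (the ungrouped situation described above)
# makes the tree score raw test values on a very different scale than the one
# it was trained on, so the test predictions degrade.
scaler = StandardScaler().fit(X_train)
tree = DecisionTreeRegressor(random_state=1).fit(scaler.transform(X_train), y_train)
print("R^2, normalization only on the training side:", tree.score(X_test, y_test))
```

    (With the grouped version the tree gives essentially the same result as training on the raw data, since trees are scale-invariant; that matches observation 1 in the question.)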

    Regards,
    Balázs
  • Jerwuney Member Posts: 19 Contributor II
    Solution Accepted
    Hi @BalazsBarany

    Thanks. This helps. 

    Regards,
    Jerwuney