Neural net and normalization

GonzaloAD Member Posts: 4 Contributor I
edited December 2018 in Help

Hello community members,

I have a doubt about the normalization performed by the Neural Net operator.


According to the guide, normalization of the data can be carried out automatically with the normalize option: "This is an expert parameter. The Neural Net operator uses the usual sigmoid function as the activation function. Therefore, the value range of the attributes should be scaled to -1 and +1. This can be done through the normalize parameter. Normalization is performed before learning. Although it increases runtime, it is necessary in most cases."


My question is: Why is it necessary to scale the values between -1 and +1? Can we choose to scale them between 0 and 1, or use another type of normalization?


Thank you very much for your answers

Best Answer

  • rfuentealba Moderator, RapidMiner Certified Analyst, Member, University Professor Posts: 568 Unicorn
    Solution Accepted

    Hi @GonzaloAD,


    An Artificial Neural Network (ANN) is a collection of units of information arranged in a series of layers. There are three types of units: input units, hidden units and output units. Each connection between two units carries a weight, and that weight can be positive or negative, depending on whether one unit "excites" or "inhibits" the other.
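    To make that concrete, here is a rough Python sketch (my own illustration with made-up numbers, not RapidMiner's internals) of a single hidden unit: each input is multiplied by the weight on its connection, and a sigmoid squashes the sum.

        import numpy as np

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        inputs  = np.array([0.5, -0.2, 0.8])   # three input units
        weights = np.array([1.5, -2.0, 0.7])   # one weight per connection;
                                               # positive "excites", negative "inhibits"
        bias = 0.1

        # Weighted sum of the inputs, squashed into (0, 1) by the sigmoid
        print(sigmoid(np.dot(weights, inputs) + bias))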


    If you use different scales for your variables, your ANN algorithm will probably not notice the correlations between those variables. Normalizing your inputs prevents such scale differences from affecting the weights. If you want to set your upper and lower bounds to +32768 and -32767, you can, but then you have to use the "Normalize" operator before feeding the data into your ANN.
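    For example, a minimal NumPy sketch (again with made-up data) of the min-max rescaling that such a Normalize step performs:

        import numpy as np

        # Two attributes on very different scales
        income = np.array([25000.0, 48000.0, 120000.0])
        age    = np.array([23.0, 41.0, 67.0])

        def rescale(x, lo, hi):
            # Min-max normalization of x into the range [lo, hi]
            return lo + (x - x.min()) * (hi - lo) / (x.max() - x.min())

        # Bring both attributes into the same range, e.g. -1 to +1 ...
        print(rescale(income, -1.0, 1.0))
        print(rescale(age, -1.0, 1.0))

        # ... or into custom bounds such as -32767 and +32768
        print(rescale(age, -32767.0, 32768.0))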

    Hope this helps.


Answers

  • MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,503 RM Data Scientist

    Hi,

    I think it's not strictly necessary, but it can make your model better. I recommend using an explicit Normalize operator up front. That way you can also change your normalization scheme.
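    For instance, two schemes you might switch between, sketched quickly in NumPy (inside RapidMiner you would just pick the corresponding method on the Normalize operator):

        import numpy as np

        x = np.array([25000.0, 48000.0, 120000.0])

        # Range transformation (min-max): maps the values into [0, 1]
        print((x - x.min()) / (x.max() - x.min()))

        # Z-transformation: zero mean, unit variance
        print((x - x.mean()) / x.std())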


    Best,

    Martin

    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
  • GonzaloAD Member Posts: 4 Contributor I

    Perfect,

    as I supposed, it's just the default option of the operator. As I had seen, another normalization can be used, as long as the Normalize operator is applied beforehand.


    Thank you very much for your answer
