"Neural Net normalization"

chaosbringer Member Posts: 21 Contributor II
edited May 2019 in Help
Hi,
I am doing a regression with a neural net. My label has values between 0 and 1.
When I train the neural net and apply the model to some data, the predictions range between -1 and 1, and I have the normalization option of the Neural Net operator activated.
Values smaller than 0 are not valid for my model, because 0 already represents the lowest acceptable value.
Is it OK to "cut" all values below 0 to 0?
Would it not be beneficial if this could be done somehow already during training, since the cut would affect the performance measurement during validation and thus the trained model?

I hope I could make my point clear... Thank you very much
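A minimal sketch of the two options raised here, written in Python with NumPy and scikit-learn rather than the RapidMiner Neural Net operator itself; the toy data, layer size, and the logit-transform workaround are assumptions for illustration, not the poster's actual setup:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy data with a label bounded in (0, 1), standing in for the poster's data.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = 1 / (1 + np.exp(-X[:, 0]))

net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(X, y)

# Option 1: post-hoc clipping, as asked in the question.
pred_clipped = np.clip(net.predict(X), 0.0, 1.0)

# Option 2: build the bound into training by regressing on the logit of the
# label and squashing predictions back through a sigmoid, so validation only
# ever sees values inside (0, 1).
eps = 1e-6
y_safe = np.clip(y, eps, 1 - eps)
y_logit = np.log(y_safe / (1 - y_safe))
net_logit = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net_logit.fit(X, y_logit)
pred_bounded = 1 / (1 + np.exp(-net_logit.predict(X)))
```

The difference between the two is exactly the point of the question: with option 1 the clipping happens only after the model is trained and validated, while with option 2 the bounded predictions are what the validation measures, so the constraint can influence model selection.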

Answers

  • drdespair Member Posts: 7 Contributor II
    I would like to bump this as I have a similar question about defining the label range. I am working on creating an index from 0 to 100, based on samples in that range, but when I run it through the modelling process I can still get negative values. How can I ensure that the model is built for a label in my indicated range? Thank you!

    Denis
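For the 0-to-100 index case, one common pattern is to min-max scale the label into [0, 1] before training, model the bounded target (for example as in the sketch above), and map predictions back afterwards. A short sketch with made-up numbers standing in for real labels and predictions:

```python
import numpy as np

lo, hi = 0.0, 100.0                      # known bounds of the index
index = np.array([12.0, 55.0, 97.0])     # hypothetical label values

y01 = (index - lo) / (hi - lo)           # scale labels into [0, 1] before training

# ... train a bounded-output model on y01, e.g. as in the sketch above ...

pred01 = np.array([-0.02, 0.61, 1.03])   # stand-in raw predictions on the [0, 1] scale
pred_index = np.clip(pred01, 0.0, 1.0) * (hi - lo) + lo  # back on the 0-100 scale, in range
```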