"Neural Net normalization"
chaosbringer
Member Posts: 21 Contributor II
Hi,
I am doing a regression with a neural net. My label values lie between 0 and 1.
When I train the neural net and apply the model to some data, the predictions range between -1 and 1, even though I have the normalization option of the Neural Net operator activated.
Values smaller than 0 are not valid for my model, because 0 already represents the lowest acceptable value.
Is it OK to "cut" all values below 0 to 0?
Would it not be beneficial if this could somehow already be done during the training process? The cut would then affect the performance measurement during validation and thus affect the trained model.
I hope I could make my point clear... Thank you very much.
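To illustrate the "cut" in question: here is a minimal NumPy sketch (not RapidMiner itself) that clips raw predictions into [0, 1] and compares the error before and after. The simulated labels and predictions are hypothetical stand-ins for the poster's data. Since every true label lies in [0, 1], clipping can never increase the squared error of any individual prediction, so the cut cannot hurt the measured performance.

```python
import numpy as np

# Hypothetical stand-in data: true labels bounded in [0, 1], while the
# net's raw predictions can stray outside that range (as observed above).
rng = np.random.default_rng(0)
y_true = rng.uniform(0.0, 1.0, size=100)
raw_pred = y_true + rng.normal(0.0, 0.3, size=100)  # some values fall below 0 / above 1

# The proposed "cut": clip all predictions into the valid label range.
clipped_pred = np.clip(raw_pred, 0.0, 1.0)

def mse(pred):
    return np.mean((pred - y_true) ** 2)

# Because every true label is inside [0, 1], clipping moves each invalid
# prediction closer to its target, so the clipped MSE is never worse.
print(mse(clipped_pred) <= mse(raw_pred))  # True
```

Doing this inside the validation loop (i.e. clipping before the performance operator is applied) would make the measured error reflect the model as it is actually used, which is the benefit the poster is asking about.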
Answers
Denis