"Neural Net Parameter Optimization ERROR"
Hi
I have read the topics mentioning the error I get quite often:
"Handle Exception: Error occurred and will be neglected by Handle Exception: Cannot reset network to a smaller learning rate."
and I do not think it is really due to a small learning rate (LR):
1) I start optimizing at 0.5 (that is NOT small).
2) The error recurs even when a much higher LR is used (> 1).
3) After the error occurs for the first time, it keeps happening even for parameter combinations that are VERY close to ones tested without any error before the first occurrence.
I tried to circumvent this by wrapping the Neural Net operator (the one being optimized) in a Handle Exception operator.
In the CATCH branch I load (via a Retrieve operator) an 'error-free model' built from the same data and connect it to the output.
But even so, after the first error occurs things seem to get slower, and the error keeps popping up for pretty much any LR value (even far from zero).
This happens with both the grid and the evolutionary optimization operators.
What to do?
Is the catch-exception workaround I am using OK (loading a valid model in the CATCH section)?
Should the Neural Net operator somehow be restarted after the error happens for the first time?
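For readers less familiar with the Handle Exception setup described above, here is a minimal Python sketch of the same pattern. This is an analogy, not RapidMiner code: `train_neural_net` is a stand-in that fails for some learning rates (mimicking the "Cannot reset network to a smaller learning rate" error), and the CATCH branch simply returns a previously stored error-free model.

```python
# Hypothetical analogue of wrapping the Neural Net operator in
# Handle Exception, with a Retrieve of a known-good model in CATCH.
# train_neural_net is a placeholder, NOT a RapidMiner API.

def train_neural_net(learning_rate):
    if learning_rate <= 0:
        # Stand-in for the failure mode reported in the post
        raise RuntimeError("Cannot reset network to a smaller learning rate")
    return {"lr": learning_rate}  # stand-in for a trained model

def train_with_fallback(learning_rate, fallback_model):
    try:
        return train_neural_net(learning_rate)   # TRY branch
    except RuntimeError:
        return fallback_model                    # CATCH branch: reuse stored model

fallback = {"lr": 0.5}
print(train_with_fallback(0.3, fallback))   # trains normally
print(train_with_fallback(-1.0, fallback))  # falls back to the stored model
```

Note that this pattern only masks the failure for one iteration; it does not reset whatever internal state causes subsequent iterations to fail, which matches the behaviour described in the post.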
???
thx
f
Answers
Does your data set contain missing values? From the comments in our bug tracker, this happens when missing values are present; it can be avoided by using the "Replace Missing Values" operator before trying to train a Neural Net.
Best,
Nils
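To make the suggested fix concrete, here is a minimal sketch of mean imputation in plain Python, analogous to what the "Replace Missing Values" operator does (with its default strategy) before the data reaches the Neural Net. This is illustrative only, not RapidMiner code, and assumes missing values appear as `None` or `NaN`.

```python
# Replace None/NaN entries with the column mean, column by column,
# analogous to the "Replace Missing Values" operator.
import math

def replace_missing_with_mean(rows):
    cols = list(zip(*rows))
    means = []
    for col in cols:
        present = [v for v in col if v is not None and not math.isnan(v)]
        means.append(sum(present) / len(present))
    return [
        [means[j] if (v is None or math.isnan(v)) else v
         for j, v in enumerate(row)]
        for row in rows
    ]

data = [[1.0, 2.0], [float("nan"), 3.0], [4.0, None], [5.0, 6.0]]
print(replace_missing_with_mean(data))
```

After this step every attribute is fully numeric, so the network's weight updates never see a missing value.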
There are NO missing data in my dataset.
f