Difference OptimizeAttributes vs. OptimizeWeights


Hi,
I want to optimize the attribute set for a neural net.
My question is simple: why should one choose the OptimizeAttributes operator over the OptimizeWeights operator? The OptimizeWeights operator also returns a subset of useful attributes, because it sets the weights of useless attributes to 0. So both operators leave you with a subset of attributes, but the weight optimizer additionally optimizes the weights. Is there a use case where this is not favorable?
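To make the distinction concrete, here is a hypothetical sketch in plain Python (not RapidMiner code, and the attribute names are made up): attribute selection produces a keep/drop decision per attribute, while weight optimization produces a continuous weight per attribute, where a zero weight implies dropping it as a side effect.

```python
# Hypothetical sketch of the difference between the two operator outputs.
example = {"att1": 1.0, "att2": 2.0, "att3": 3.0}

# Output of an attribute-selection operator: keep or drop per attribute.
selection = {"att1": True, "att2": False, "att3": True}
selected = {k: v for k, v in example.items() if selection[k]}

# Output of a weight-optimization operator: a weight per attribute.
# A zero weight drops the attribute; non-zero weights also rescale it.
weights = {"att1": 0.8, "att2": 0.0, "att3": 1.5}
weighted = {k: v * weights[k] for k, v in example.items() if weights[k] != 0.0}

print(selected)   # both leave the subset {att1, att3} ...
print(weighted)   # ... but weighting additionally rescales the values
```

Both results keep the same subset of attributes, which is why the operators look interchangeable at first glance; the difference is only whether the surviving attributes are rescaled.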
Thank you very much,
Julian
Answers
I cannot find an OptimizeAttributes operator in my RapidMiner.
Best,
Simon
Anyhow, I think my question falls into the category "dumb beginner question" and is therefore obsolete.
I wouldn't say this is a "dumb" question :-) If I understand you correctly, your problem is solved, right? You can also have a look at the sample processes; I think the operators are used there as well.
Best,
Simon