Decision Trees and Attribute Set Selection
chaosbringer
Member Posts: 21 Contributor II
Hi,
I have a probably stupid question.
Is it reasonable to use the Attribute Selection operator (e.g. Evolutionary Optimization) with decision trees?
I ask because during decision tree induction the attribute set is already reduced automatically through pruning and information-criterion-based node splitting, and this process eliminates unnecessary attributes on its own. Am I right?
If I am, would it not make sense to use a decision tree for attribute selection and then use the attributes appearing at the tree nodes for, e.g., neural net learning?
Thank you very much.
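For illustration only, the workflow described above can be sketched outside RapidMiner. The snippet below is a minimal example in Python with scikit-learn (the dataset, tree depth, and network size are assumptions made for the sake of the example, not anything from this thread): a pruned decision tree is fit first, the attributes it never splits on are dropped, and a neural net is trained on the remaining ones.

```python
# Minimal sketch (Python / scikit-learn, not RapidMiner operators).
# Idea: fit a pruned decision tree, keep only the attributes it actually
# splits on, then train a neural net on that reduced attribute set.
# Dataset, tree depth, and network size are arbitrary assumptions.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A depth-limited (pruned) tree: attributes with zero importance were
# never used in any split, so they can be dropped.
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
used = tree.feature_importances_ > 0

# Neural net trained only on the tree-selected attributes.
net = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0),
)
net.fit(X_train[:, used], y_train)

print("attributes kept:", int(used.sum()), "of", X.shape[1])
print("test accuracy:", net.score(X_test[:, used], y_test))
```

Whether this beats a wrapper approach such as evolutionary attribute selection is exactly the question raised above; the tree-based filter is cheaper, but it inherits the tree's own splitting biases.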
Answers
That consideration might provide a motivation for reducing the attribute set by pre-processing, but you have to be careful not to introduce junk in return, such as user assumptions about what matters, even indirectly through parameter bias.
O for a lump of perfect Green...