Decision Trees and Attribute Set Selection


Hi,
I have a probably stupid question.
Is it reasonable to use the Attribute Selection operator (e.g. Evolutionary Optimization) with decision trees?
I ask because during decision tree induction the attribute set is already reduced through pruning and information-criterion-based node splitting, and this process automatically eliminates unnecessary attributes. Am I right?
If I am, would it not make sense to use a decision tree for attribute selection and then use the attributes appearing at the tree nodes for, e.g., neural net learning?
Thank you very much.
Answers
This latter consideration might be a motivation for reducing the attribute set in a pre-processing step, but you have to be careful not to introduce junk in return, such as user assumptions about which attributes matter, even if they only enter indirectly through parameter bias.
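For what it's worth, here is a minimal sketch of the idea in scikit-learn rather than RapidMiner's operators (which this thread is about), just to make the workflow concrete: induce a tree, keep only the attributes it actually splits on, then train a neural net on that reduced set. The dataset and parameter choices are illustrative assumptions, not a recommendation.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

# Illustrative dataset; substitute your own ExampleSet / data.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Decision tree induction; pruning and split criteria drop irrelevant attributes.
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_train, y_train)

# Attributes used in at least one node end up with non-zero importance.
used = np.flatnonzero(tree.feature_importances_ > 0)
print("attributes kept by the tree:", used)

# Train the neural net only on the tree-selected attributes.
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
mlp.fit(X_train[:, used], y_train)
print("test accuracy on reduced set:", mlp.score(X_test[:, used], y_test))
```

Note the caveat from above still applies: a single tree's greedy, axis-parallel splits can discard attributes that a neural net would happily exploit in combination, so the tree's choices are themselves a form of bias you are importing into the second learner.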