
Decision Trees and Attribute Set Selection

chaosbringer Member Posts: 21 Contributor II
edited November 2018 in Help
Hi,
I have a possibly stupid question.
Is it reasonable to use the Attribute Selection operator (e.g. Evolutionary Optimization) with decision trees?
I ask because, during decision tree induction, the attribute set is automatically reduced by pruning and information-criterion-based node splitting, and this process automatically eliminates unnecessary attributes. Am I right?
If I am, would it not make sense to use a decision tree for attribute selection, and then use the attributes that appear at the tree nodes for, e.g., neural net learning?
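For concreteness, the idea could be sketched like this in scikit-learn (not RapidMiner; the dataset, the zero-importance threshold, and all parameters here are my own assumptions, not anything from the thread):

```python
# Sketch: use a decision tree to pick attributes, then train a neural
# net on only the attributes the tree actually split on.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic data: 5 informative attributes hidden among 20.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, n_redundant=0,
                           random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X, y)
# Attributes that appear at a split node get nonzero importance;
# attributes the tree never used stay at exactly zero.
used = np.flatnonzero(tree.feature_importances_ > 0)
print("attributes used by the tree:", used)

# Train the neural net on the reduced attribute set only.
net = MLPClassifier(max_iter=1000, random_state=0).fit(X[:, used], y)
print("training accuracy on reduced set: %.2f" % net.score(X[:, used], y))
```

Whether the tree's choice of attributes is actually a good selection for the neural net is exactly what the answer below disputes.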

Thank you very much.

Answers

    haddock Member Posts: 849 Maven
    Hi there,
    I ask because, during decision tree induction, the attribute set is automatically reduced by pruning and information-criterion-based node splitting, and this process automatically eliminates unnecessary attributes. Am I right?
    In a word, no! What happens is that the most entropy-reducing attribute gets used, wherever you are in the tree. The converse is that the useless stuff remains available for later. In theory this means that you can make decisions without having to ask silly questions, and in practice it means that you can overcome the noise that junk attributes generate. The flip-side is that you can end up carrying a load of junk through the process; it won't affect the result, but it does slow things down.

    This latter consideration might provide a motivation for reducing the attribute set by pre-processing, but you have to be careful not to introduce junk in return, like user assumptions about what matters, even indirectly through parameter bias.
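    As a toy illustration of the entropy-reduction point above (pure Python; the attribute names and data values are invented for the example): a perfectly predictive attribute gets maximal information gain and is split on first, while a junk attribute gets zero gain and is simply never chosen.

```python
# Information gain = entropy of the labels minus the weighted entropy
# remaining after partitioning by an attribute. This is the quantity a
# tree maximizes when choosing which attribute to split on.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    # Partition the labels by the value of `attr`, then compute the
    # weighted entropy of the partitions.
    parts = {}
    for row, lab in zip(rows, labels):
        parts.setdefault(row[attr], []).append(lab)
    remainder = sum(len(p) / len(labels) * entropy(p) for p in parts.values())
    return entropy(labels) - remainder

# 'outlook' fully determines the label; 'junk' is uninformative noise.
rows = [{"outlook": "sun", "junk": "a"}, {"outlook": "sun", "junk": "b"},
        {"outlook": "rain", "junk": "a"}, {"outlook": "rain", "junk": "b"}]
labels = ["yes", "yes", "no", "no"]

print(info_gain(rows, labels, "outlook"))  # 1.0: split on this attribute
print(info_gain(rows, labels, "junk"))     # 0.0: never used for a split
```

    Note that the junk attribute is merely ignored at split time, not removed from the data, which is the slowdown mentioned above.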

    O for a lump of perfect Green...

    chaosbringer Member Posts: 21 Contributor II
    Thank you for your answer.
