Information gain and numerical attributes
How does RapidMiner handle numerical attributes in the information gain calculation for feature selection? Is every occurring value used, or does RM calculate several "bins"?
Answer by ****:
Do you refer to the InfoGainWeighting operator or to the information gain calculation inside a decision tree learner?
> Is every occurring value used or does RM calculate several "bins"?
Both are possible. If you discretize the values first with one of the discretization operators, those bins are used. If not, RM tries all possible split points, i.e. every threshold between two consecutive values of the sorted attribute.
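To illustrate the "all possible split points" case, here is a minimal sketch (not RapidMiner's actual code) that evaluates every midpoint between consecutive sorted values of a numerical attribute as a binary split threshold and returns the one with the highest information gain:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split_info_gain(values, labels):
    """Try every midpoint between consecutive distinct sorted values
    as a binary split threshold; return (best_gain, best_threshold)."""
    pairs = sorted(zip(values, labels))
    base = entropy(labels)
    n = len(pairs)
    best_gain, best_threshold = 0.0, None
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no split point between equal values
        threshold = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [label for _, label in pairs[:i]]
        right = [label for _, label in pairs[i:]]
        # information gain = entropy before split - weighted entropy after
        gain = (base
                - (len(left) / n) * entropy(left)
                - (len(right) / n) * entropy(right))
        if gain > best_gain:
            best_gain, best_threshold = gain, threshold
    return best_gain, best_threshold

# Example: a perfectly separable attribute
gain, thr = best_split_info_gain([1.0, 2.0, 3.0, 4.0], ["a", "a", "b", "b"])
print(gain, thr)  # -> 1.0 2.5
```

With discretization, you would instead replace the loop over midpoints by the fixed bin boundaries produced by the discretization operator.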
Answer by topic starter:
I was referring to the InfoGainWeighting operator, which is used for feature selection.