[SOLVED] attribute selection

gutompf Member Posts: 21 Contributor II
edited November 2018 in Help
I am doing attribute selection by backward elimination as described in some RapidMiner tutorials, e.g. on YouTube. It works fine, but I would like to know whether I can improve it further with some changes to my process.
Question: would it be better to nest an optimization of the learner's parameters inside this procedure, or is it correct to just set the learner (I am using LibSVM) to default values (or values I expect to work well)?
I mean optimizing the learner that is nested inside the Validation operator (with Validation nested inside Backward Elimination).

Thanks,
Milan

Answers

  • MariusHelf RapidMiner Certified Expert, Member Posts: 1,869 Unicorn
    Hi gutompf,

    Of course it is valid to optimize the parameters of the learner inside the feature selection; that way you get the most out of your data. Instead of setting manually chosen "good" parameters, you might even consider placing a complete parameter optimization operator inside the feature selection, since the optimal parameters may change for different feature sets. Of course, that will drastically increase the running time of your process.
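    Outside of RapidMiner, the same nesting can be sketched in scikit-learn (an assumption for illustration only, not the poster's actual process): a `GridSearchCV` tunes the SVM for each candidate feature subset that the backward-elimination wrapper evaluates. The parameter grid values below are illustrative choices, not values from this thread.

    ```python
    from sklearn.datasets import load_iris
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)

    # Inner loop: tune C and gamma of the SVM for each candidate feature subset.
    # The grid values here are arbitrary examples.
    inner = GridSearchCV(
        SVC(kernel="rbf"),
        param_grid={"C": [0.1, 1, 10], "gamma": ["scale", 0.1]},
        cv=3,
    )

    # Outer loop: backward elimination. Each subset is scored by cross-validating
    # the *tuned* SVM, so the best parameters can differ per feature set --
    # the trade-off is a much longer running time, as Marius notes.
    selector = SequentialFeatureSelector(
        inner, n_features_to_select=2, direction="backward", cv=3
    )
    selector.fit(X, y)
    print(selector.get_support())
    ```
    
    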

    Btw, you could also try Forward Selection - it is usually faster, since it starts with an empty feature set.
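    In the same illustrative scikit-learn terms (again an assumption, not RapidMiner code), switching to forward selection is just a change of direction: the search starts from an empty feature set and greedily adds features, so each candidate model is trained on fewer attributes.

    ```python
    from sklearn.datasets import load_iris
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)

    # direction="forward" starts from an empty set and adds features greedily;
    # each candidate fit uses a smaller feature subset than backward elimination.
    fwd = SequentialFeatureSelector(
        SVC(), n_features_to_select=2, direction="forward", cv=3
    )
    fwd.fit(X, y)
    print(fwd.get_support())
    ```
    
    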

    Best regards,
    Marius