
"Comparing multiple methods with same dataset in the Wrapper Feature Selection"

Salo Member Posts: 6 Contributor I
edited June 2019 in Help
As part of a project of mine I am trying to compare four methods (NB, Random Forest, SVM, MLP). In the process, the data set is multiplied into four distinct wrappers (standard Optimize Selection) in which the feature selection takes place. As the inner learner of each wrapper I use X-Validation to obtain the performance of each learner and then the selected feature set. However, I want to be sure that the subsets created by the cross-validation (10 folds in this case) are exactly the same across all the methods, for consistency's sake. How would I do that?


Summing up: how do I guarantee that each wrapper gets the same rows within each fold every time?

Answers

  • fras Member Posts: 93 Contributor II
    Activate "Use local random seed" in the X-Validation operator