
"Comparing multiple methods with same dataset in the Wrapper Feature Selection"

SaloSalo Member Posts: 6 Contributor II
edited June 2019 in Help
As part of a project of mine I am trying to compare four methods (NB, Random Forest, SVM, MLP). The process reaches a step where the data set is multiplied into 4 distinct wrappers (standard Optimize Selection) in which the feature selection takes place. As the inner learner of each wrapper I am using X-Validation to obtain the performance of each learner and then the selected feature set. However, I want to be sure that the subsets created by the cross-validation (10 folds in this case) are exactly the same across all the methods, for consistency's sake. How would I do that?

To sum up: how do I guarantee that each wrapper gets the same rows within each fold every time?


    frasfras Member Posts: 93 Contributor II
Activate "Use local random seed" in each X-Validation operator and give all four the same seed value; the fold assignments will then be identical.
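The effect of a local random seed can be sketched in plain Python (not RapidMiner; the `make_folds` helper, the row count of 100, and the seed value are illustrative assumptions): each wrapper that builds its folds from the same seed gets the exact same row-to-fold assignment, independent of anything else the process has done with the global random state.

```python
import random

def make_folds(n_rows, n_folds, seed):
    """Shuffle row indices with a dedicated (local) RNG and deal them into folds."""
    rng = random.Random(seed)            # local seed: isolated from global random state
    indices = list(range(n_rows))
    rng.shuffle(indices)
    # deal shuffled indices round-robin into n_folds folds
    return [indices[i::n_folds] for i in range(n_folds)]

# four "wrappers" (one per learner) each build their own 10 folds from the same seed
folds_per_learner = {name: make_folds(100, 10, seed=1992)
                     for name in ("NB", "RandomForest", "SVM", "MLP")}

# same seed -> every learner sees identical folds
reference = folds_per_learner["NB"]
assert all(folds == reference for folds in folds_per_learner.values())
print("all four learners see identical folds")
```

Without a fixed local seed, each wrapper would draw its own shuffle, so the folds (and therefore the per-fold performance estimates driving the feature selection) would differ between the four methods.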
Sign In or Register to comment.