[SOLVED] Optimizing Parameters over Several ExampleSets
I am stuck with a problem and I hope I can get some ideas here on how to handle it in RapidMiner.
I have 100 examples of binary classification tasks that I want to evaluate some algorithms on. I figured out how to build a process that performs validation and measures the performance.
Now I would like to find the best parameters for some of the algorithms, for example an SVM. I saw that the Optimize Parameters (Grid) operator can help me figure out the best parameters for a learner on a given input.
However, I don't see how it can help me optimize the parameters across all 100 of my ExampleSets instead of just one. I can feed several ExampleSets in through the input ports and run multiple validations inside, but the operator only offers one output port for performance. I could also loop over every ExampleSet, but I haven't found a way to combine the resulting performance vectors.
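To make the intended optimization concrete, here is a hypothetical sketch in Python (not RapidMiner) of what I am after: for each parameter value, evaluate the learner on every dataset, average the performances, and keep the parameter value with the best average. The datasets, grid values, and use of scikit-learn here are only illustrative stand-ins for my 100 ExampleSets.

```python
# Sketch of "optimize one parameter setting across many datasets":
# pick the value whose cross-validated performance, averaged over ALL
# datasets, is best -- not the best value per individual dataset.
from statistics import mean

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Stand-ins for the 100 ExampleSets (3 synthetic ones to keep it fast).
datasets = [make_classification(n_samples=100, random_state=s)
            for s in range(3)]

grid_C = [0.1, 1.0, 10.0]  # illustrative parameter grid for the SVM

best_C, best_score = None, float("-inf")
for C in grid_C:
    # Inner loop over datasets; each entry is a cross-validated accuracy.
    scores = [mean(cross_val_score(SVC(C=C), X, y, cv=5))
              for X, y in datasets]
    avg = mean(scores)  # combine the per-dataset performance vectors
    if avg > best_score:
        best_C, best_score = C, avg

print(best_C, round(best_score, 3))
```

In RapidMiner terms, the averaging step in the middle is exactly the piece I am missing: something that merges the performance vectors produced inside a loop over ExampleSets into one value the optimizer can compare.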
Can anyone kindly point me in the right direction?
Thanks in advance,