[SOLVED] Creating ensemble methods (bagging)

laplanto Member Posts: 3 Contributor I
edited November 2018 in Help
Hi, I am trying to create a hybrid classifier using two different classification algorithms. I want to use bagging, so the idea would be to split the dataset into multiple datasets, classify each with one kind of classifier, and then classify the results with another classifier.
I have a dataset. I use the Bagging operator in RapidMiner with k-NN inside it. I manage to get classification results from each k-NN. How can I collect those results and pass them as a dataset to the next classifier (probably Random Forest)?


    MariusHelf RapidMiner Certified Expert, Member Posts: 1,869 Unicorn

    If I understand you correctly, what you want to do is commonly referred to as "stacking". We have a corresponding Stacking operator in RapidMiner. Please have a look at its documentation to see whether it fits your needs.
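    For anyone landing here later who wants to prototype the same idea outside RapidMiner, here is a minimal sketch of stacking in Python with scikit-learn (an assumption on my part — the thread uses RapidMiner's Stacking operator, but the structure is the same: several base learners whose predictions feed a second-level model):

    ```python
    # Hypothetical stand-in for the RapidMiner Stacking operator:
    # several k-NN base learners feed their predictions to a
    # second-level Random Forest.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier, StackingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Base level: several k-NN models with different k values.
    base_learners = [(f"knn_{k}", KNeighborsClassifier(n_neighbors=k))
                     for k in (1, 3, 5)]

    # Second level: a random forest learns from the base predictions.
    stack = StackingClassifier(
        estimators=base_learners,
        final_estimator=RandomForestClassifier(random_state=0),
    )
    stack.fit(X_train, y_train)
    print(round(stack.score(X_test, y_test), 2))
    ```

    The number and variety of base learners is just a parameter here, so testing with 10, 100, or 200 base models is a one-line change to the list comprehension.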

    Best regards,
    laplanto Member Posts: 3 Contributor I
    If I understand correctly, stacking is exactly what I need, but I have problems using it.
    First of all, I need to test this hybrid classifier with 10, 100, and 200 k-NNs in the first level, and I don't know how to do that. I think I need to use the Split and Bagging operators.
    Secondly, I need to use SVM in the second level, but I get the error "SVM does not have sufficient capabilities for the given data: binomial attributes not supported".
    laplanto Member Posts: 3 Contributor I
    OK, I managed to do this:

    The only problem is that it does not work with SVM. I get the error "The operator SVM does not have sufficient capabilities for the given data set: binomial attributes not supported".
    If I replace SVM with k-NN it works perfectly, but I need SVM to be there.


    I found a workaround for my problem: I swapped the positions of k-NN and SVM and it works OK. Of course, for future work, it would be nice to know how to solve this problem properly.
    KellyM Member Posts: 21 Maven

    This may be an old thread, but in case anyone else stumbles upon it like I did, I thought I should answer what I can.


    The k-NN model creates a new binomial prediction attribute that is appended to your dataset. SVM cannot handle binomial attributes, which is why it throws that error. The algorithms placed in the Base Learner window (left side) of the Stacking operator will always create this new binomial attribute, so the algorithm in the Stacking Model Learner window (right side) must be able to handle binomial attributes.
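    To illustrate the same issue outside RapidMiner (a scikit-learn sketch, assuming that library rather than the thread's operator workflow): if the base-learner predictions reach the second level as categorical class labels, an SVM cannot consume them directly, but feeding it numeric class probabilities instead sidesteps the problem. In scikit-learn this is controlled by `stack_method`:

    ```python
    # k-NN base learners with an SVM as the second-level model.
    # stack_method="predict_proba" passes numeric class probabilities
    # to the SVM instead of categorical predicted labels, so the
    # "binomial attributes not supported" situation never arises.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import StackingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    base_learners = [(f"knn_{k}", KNeighborsClassifier(n_neighbors=k))
                     for k in (3, 5, 7)]

    stack = StackingClassifier(
        estimators=base_learners,
        final_estimator=SVC(),
        stack_method="predict_proba",
    )
    stack.fit(X_train, y_train)
    print(round(stack.score(X_test, y_test), 2))
    ```

    Whether RapidMiner's Stacking operator offers an equivalent probability-based option I can't say, so treat this only as a conceptual parallel.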

    M_Martin RapidMiner Certified Analyst, Member Posts: 125 Unicorn

    Hi KellyM: Thanks so much for your post!


    Best wishes, Michael Martin
