Stacking: probabilities instead of label?

spitfire_ch Member Posts: 38 Maven
edited November 2018 in Help
Hi,

I was experimenting with stacking and noticed that the only output the base learners provide to the "stacking model learner" is the final label. Don't most base learners contain more information than just the final label? More specifically, couldn't one pass probabilities instead of the final label?

E.g. if the chosen leaf of a decision tree contains 3 positive and 2 negative cases, pass 3/5 instead of P. That way, each "guess" by the base models would automatically be weighted. If model A is sure about a result while model B is not (and predicts a different outcome), the prediction of model A would be favored.
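To make the idea concrete, here is a minimal sketch in Python with scikit-learn (as a stand-in for the RapidMiner operators discussed here): the meta-learner's input table is built either from the base models' hard labels or from their class probabilities. The dataset and model choices are arbitrary illustrations, not anything from this thread.

```python
# Illustrative sketch (scikit-learn, not RapidMiner): build the
# meta-learner's input from base-model probabilities instead of labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, random_state=0)
X_base, X_meta, y_base, y_meta = train_test_split(X, y, random_state=0)

base_models = [RandomForestClassifier(random_state=0),
               LogisticRegression(max_iter=1000)]
for m in base_models:
    m.fit(X_base, y_base)

# "Predictions only": each base model contributes one hard-label column.
labels = np.column_stack([m.predict(X_meta) for m in base_models])

# "Confidences only": each base model contributes its positive-class
# probability, so a model that is only 3/5 sure carries less weight
# than one that is 99% sure.
probs = np.column_stack([m.predict_proba(X_meta)[:, 1]
                         for m in base_models])

# The stacking model then learns from the confidence columns.
meta = LogisticRegression().fit(probs, y_meta)
```

With the label columns, the meta-learner only sees which class each base model voted for; with the probability columns, it can also learn how much to trust each vote.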

Best regards
Hanspeter

Answers

  • IngoRM Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, Community Manager, RMResearcher, Member, University Professor Posts: 1,751 RM Founder
    Hi,

    I fully agree that passing the confidences in addition, or even instead, could definitely improve the quality of the complete model. I suppose the original paper only passed the predictions, and we probably stuck to that description. In order not to break compatibility and to allow these different options, I would suggest adding a new parameter which allows choosing between "predictions only", "confidences only", or "predictions and confidences".

    Thanks for sending this in. Cheers,
    Ingo
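    As a point of comparison (an illustration with scikit-learn, not a statement about how RapidMiner implements this): scikit-learn's StackingClassifier exposes a switch much like the proposed parameter, via its stack_method argument, where "predict" corresponds to "predictions only" and "predict_proba" to "confidences only".

    ```python
    # Sketch of the parameter idea using scikit-learn's StackingClassifier.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=200, random_state=0)
    estimators = [("rf", RandomForestClassifier(random_state=0)),
                  ("lr", LogisticRegression(max_iter=1000))]

    # stack_method="predict" would feed hard labels to the meta-learner;
    # "predict_proba" feeds class confidences instead.
    stack = StackingClassifier(estimators=estimators,
                               final_estimator=LogisticRegression(),
                               stack_method="predict_proba")
    stack.fit(X, y)
    ```

    There is no built-in "predictions and confidences" mode there; that combined option would need the columns to be concatenated manually.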
  • spitfire_ch Member Posts: 38 Maven
    Hi Ingo,

    thanks for your reply. Making this optional totally makes sense. It would also make it possible to directly investigate whether predictions vs. confidences make a difference, and to do some more tweaking of the model during development.

    Cheers,
    Hanspeter