
Gradient Boosted Trees don't give the final prediction

sbrnae Member Posts: 2 Newbie
Hello RapidMiner Community!

I want to ask about the Gradient Boosted Trees model that I used in my study on predicting corporate default risk. My dependent variable is default vs. non-default, and I use 1 for default and 0 for non-default. I have already set the data type to binominal. After I apply the related operators such as Select Attributes, Set Role, and Cross Validation, none of the trees in the result show branches that end in 1 or 0 as I assigned them. Below I share one of the Gradient Boosted models.


So, my question is: how does this happen, and is there any way to solve it? I really hope someone can help me, because it is important for my study and the due date is near. I'm open to answers from anyone. Thank you in advance.

Answers

    MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,511 RM Data Scientist
    Hi,
    I think you need to check how a GBT works to understand this. The leaf values here are simply not the label you are looking for.

    In each boosting iteration, the new tree tries to predict the residual:
    real_value - \sum previous_predictions,

    where the real value is 0 or 1. So if the first tree predicts 0.7 where the real value is 1, the second tree tries to predict 0.3, and so on.*
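    To make the mechanism concrete, here is a minimal sketch in Python (scikit-learn's DecisionTreeRegressor on a made-up dataset, with squared-error residuals for simplicity; the GBT operator in RapidMiner is backed by H2O and uses a different loss, so the exact numbers differ, but the idea is the same):

    ```python
    # Minimal gradient-boosting sketch on a 0/1 label.
    # Each tree fits the residual (what previous trees got wrong),
    # so its leaves hold small continuous corrections, not 0 or 1.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)   # binary label: 0 / 1

    # "0-th run": start from the average of the classes, not from 0 or 1
    prediction = np.full_like(y, y.mean())

    trees = []
    learning_rate = 0.1
    for _ in range(50):
        residual = y - prediction                     # what the next tree must explain
        tree = DecisionTreeRegressor(max_depth=3)
        tree.fit(X, residual)                         # leaves hold residual corrections
        trees.append(tree)
        prediction += learning_rate * tree.predict(X)

    # only after summing all trees is the score mapped back to the binominal label
    predicted_label = (prediction > 0.5).astype(int)
    print("accuracy:", (predicted_label == y).mean())
    ```

    That is why an individual tree in the result never shows 1 or 0 at its leaves: only the sum over all trees, converted at the end, gives the final prediction.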

    Please check this guide: https://community.rapidminer.com/discussion/36379/a-practical-guide-to-gradient-boosted-trees-part-i-regression . It should help (even though it's about regression).

    Best,
    Martin

    *: The reality is a bit different, because it depends on the chosen loss measure. Also note that there is effectively a 0-th run, where you just predict the average of the classes, so you do not start from 0/1.
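    To illustrate that footnote: with the binomial loss, the 0-th prediction is the log-odds of the class average, and each tree fits y - sigmoid(score) rather than the raw residual. A tiny NumPy sketch with made-up labels (again only an illustration, not the actual H2O implementation):

    ```python
    import numpy as np

    y = np.array([1., 0., 1., 1., 0.])               # toy binary labels
    p0 = y.mean()                                     # average of the classes ("0-th run")
    score = np.full_like(y, np.log(p0 / (1 - p0)))    # initial log-odds, not 0 or 1

    sigmoid = lambda s: 1.0 / (1.0 + np.exp(-s))
    pseudo_residual = y - sigmoid(score)              # what the first tree would fit
    print(pseudo_residual)
    ```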



    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany