
Gradient boosting weights

k_vishnu772 Member Posts: 34 Contributor I
edited June 2019 in Help
Hi all,
I ran a model with the gradient boosting algorithm in RapidMiner and looked at the weights generated for each input parameter. Some of them have a weight of zero. Does that mean those attributes are eliminated from the model? In other words, does it feature-select the parameters with positive weight?
Could you please help me with this?

Regards
Vishnu

Best Answer

    MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,507 RM Data Scientist
    Solution Accepted

    Hi,


    Yes and yes, I would call it feature selection. I often use this (or the weights of a random forest) for feature selection: just use a Select by Weights operator afterwards.
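    For readers outside RapidMiner, the same idea can be sketched in Python with scikit-learn (illustrative only; the dataset and model parameters here are made up, and the boolean mask plays the role of the Select by Weights operator):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Toy data; in RapidMiner this would be your ExampleSet.
X, y = make_classification(n_samples=300, n_features=10, n_informative=3,
                           n_redundant=0, random_state=0)

gbm = GradientBoostingClassifier(n_estimators=50, random_state=0).fit(X, y)

# Keep only attributes with a positive weight -- the role that the
# Select by Weights operator plays after the learner.
keep = gbm.feature_importances_ > 0
X_selected = X[:, keep]
print(f"kept {keep.sum()} of {X.shape[1]} attributes")
```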


    Best,

    Martin

    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany

Answers

    MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,507 RM Data Scientist

    Hi @k_vishnu772,

    the weights are calculated afterwards. Basically, you run over all trees, calculate the influence of each split, and sum it up. A value of 0 in the weights indicates that the attribute was never used for any split.
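    As an aside, this "sum the influence of each split over all trees" calculation can be sketched with scikit-learn (a made-up example, not RapidMiner's internals; the appended constant column guarantees one attribute that is never split on):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=300, n_features=6, n_informative=3,
                           n_redundant=0, random_state=0)
# A constant attribute can never produce a valid split.
X = np.hstack([X, np.zeros((len(X), 1))])

gbm = GradientBoostingClassifier(n_estimators=40, random_state=0).fit(X, y)

# Run over all trees and, for every split node, credit its weighted
# impurity reduction to the attribute it splits on.
weights = np.zeros(X.shape[1])
for tree in gbm.estimators_.ravel():
    t = tree.tree_
    for node in range(t.node_count):
        if t.children_left[node] == -1:        # leaf: no split, no credit
            continue
        left, right = t.children_left[node], t.children_right[node]
        gain = (t.weighted_n_node_samples[node] * t.impurity[node]
                - t.weighted_n_node_samples[left] * t.impurity[left]
                - t.weighted_n_node_samples[right] * t.impurity[right])
        weights[t.feature[node]] += gain

weights /= weights.sum()   # normalise so the weights sum to 1
print(weights[-1])         # the constant attribute ends up with weight 0.0
```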


    Cheers!

    Martin

    k_vishnu772 Member Posts: 34 Contributor I

    @mschmitz

    So an attribute with a weight of zero means that even if I remove it, the model should give the same results, right?

    And it is a kind of feature selection?
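    One way to check that claim in code (a scikit-learn sketch, not RapidMiner; the extra constant column is a contrived attribute that is guaranteed a weight of zero):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=300, n_features=5, n_informative=3,
                           n_redundant=0, random_state=0)
X_extra = np.hstack([X, np.zeros((len(X), 1))])  # attribute with weight 0

full = GradientBoostingClassifier(n_estimators=30, random_state=0).fit(X_extra, y)
slim = GradientBoostingClassifier(n_estimators=30, random_state=0).fit(X, y)

# The constant attribute was never used for a split...
print(full.feature_importances_[-1])
# ...so dropping it and retraining should reproduce the predictions.
agreement = (full.predict(X_extra) == slim.predict(X)).mean()
print(agreement)
```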
