Bagging optimization reduced performance

ravtal Member Posts: 9 Contributor II
edited January 2020 in Help

I am running the attached dataset to measure the performance of the model. A decision tree gave me a good accuracy value; however, when I added a Bagging operator to improve the model, the accuracy actually dropped.
Could anyone help me with what changes I need to make in the model so that accuracy is optimized?

Note: the dataset has no attribute names, so uncheck "first row as names" and set the column separator to "," while importing the dataset.



  •
    MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,508 RM Data Scientist
    Hi @ravtal,
    this sounds like you overtrained your decision tree. Did you check for that?

    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
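    The overtraining check above can be sketched outside the RapidMiner GUI as well. A minimal illustration in scikit-learn, using a synthetic dataset since the original attachment is not available here (the data and parameters are assumptions, not from the thread):

    ```python
    # Sketch: detect overfitting by comparing train vs. test accuracy of an
    # unrestricted decision tree. Synthetic data stands in for the attachment.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # A fully grown tree tends to memorize the training set.
    deep = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
    train_acc = deep.score(X_train, y_train)
    test_acc = deep.score(X_test, y_test)
    print(f"train={train_acc:.2f} test={test_acc:.2f}")
    # A large gap between train and test accuracy indicates overfitting;
    # bagging cannot fully rescue base trees that are this unconstrained.
    ```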
  •
    ravtal Member Posts: 9 Contributor II
    You mean I should reduce the depth?
  •
    varunm1 Moderator, Member Posts: 1,207 Unicorn
    Hello @ravtal

    As @mschmitz said, it might be due to overfitting. Did you try hyperparameter optimization with the "Optimize Parameters (Grid)" operator? You can search for the best hyperparameters for your algorithm and reduce overfitting.

    Be Safe. Follow precautions and Maintain Social Distancing
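
    The grid-search idea behind the Optimize Parameters (Grid) operator can be sketched in scikit-learn (requires scikit-learn >= 1.2 for the `estimator` keyword; the parameter ranges and synthetic data are illustrative assumptions, not from the thread):

    ```python
    # Sketch: grid-search the base tree's depth and leaf size inside a
    # bagging ensemble, analogous to wrapping Bagging in an optimization loop.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Constrained trees reduce overfitting; bagging then averages away variance.
    param_grid = {
        "estimator__max_depth": [2, 4, 6, None],
        "estimator__min_samples_leaf": [1, 5, 10],
    }
    search = GridSearchCV(
        BaggingClassifier(estimator=DecisionTreeClassifier(random_state=0),
                          n_estimators=25, random_state=0),
        param_grid, cv=5,
    )
    search.fit(X_train, y_train)
    print(search.best_params_, search.score(X_test, y_test))
    ```

    The cross-validated search picks the tree settings at which bagging actually helps, instead of bagging one hand-tuned (and possibly overfit) tree.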

  •
    ravtalravtal Member Posts: 9 Contributor II
    @varunm1 OK, but do you think the model design is good?