Optimize Parameter/Log for Decision Tree

Prentice Member Posts: 66 Maven
edited June 2019 in Help
Hello,

I'm trying to compare different algorithms and find the optimal parameters for each. For this I use the Optimize Parameters operator. However, when I apply it to a Decision Tree classifier it doesn't work: the Log operator doesn't behave properly, and I also get the same accuracy for every parameter setting.
When I plot the logged values, it only shows one value for each parameter.

Does somebody know why this happens and how I can solve it?

Thanks
Prentice

Answers

  • sgenzer Administrator, Moderator, Employee, RapidMiner Certified Analyst, Community Manager, Member, University Professor, PM Moderator Posts: 2,959 Community Manager
    hi @Prentice so I looked at your process, and if I put a breakpoint right before Optimize Parameters, I see an ExampleSet with SIX examples. There is no way you are going to build a machine learning model with such a small data set. This is why your "accuracy" is always the same.

    Scott
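
    To illustrate Scott's point outside of RapidMiner, here is a minimal scikit-learn sketch (the data is a hypothetical six-example stand-in, not the original set). Because one split already separates the two classes perfectly, every `max_depth` setting in the grid search ends up with the same cross-validated accuracy, mirroring the constant accuracy seen in Optimize Parameters.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import GridSearchCV

    # Hypothetical stand-in data: only six examples, two well-separated classes
    X = np.array([[0], [1], [2], [10], [11], [12]])
    y = np.array([0, 0, 0, 1, 1, 1])

    # Try several max_depth values, as Optimize Parameters would
    grid = GridSearchCV(
        DecisionTreeClassifier(random_state=0),
        param_grid={"max_depth": [1, 2, 3, 4]},
        cv=3,
    )
    grid.fit(X, y)

    # One split already separates the classes, so every depth scores identically
    print(grid.cv_results_["mean_test_score"])
    ```

    With a data set this small, varying the parameter cannot change the model, so the optimizer has nothing to optimize.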

  • Prentice Member Posts: 66 Maven
    Hello @sgenzer,

    Ah, so this means that my actual ExampleSet is also too small? I've got about 150 examples with 4 classes; that does explain it. The tree goes to the same branch each time, hence the same confidence.
    But why does the Log operator not work when I use different settings for the Decision Tree? The Optimize Parameters results show the different combinations, but the Log results show the same combination for every iteration. This only happens with the Decision Tree.

    Prentice
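
    The "same branch, hence the same confidence" effect can be sketched in scikit-learn as well (the data here is a synthetic stand-in with roughly the shape Prentice describes, 150 examples and 4 classes, not the original set). When the tree's stopping criterion prevents any split, every example lands in the single root leaf and receives identical confidences:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    # Synthetic stand-in: 150 examples, 4 classes (assumption, not the original data)
    X = rng.normal(size=(150, 2))
    y = rng.integers(0, 4, size=150)

    # min_samples_split larger than the data set -> the tree can never split,
    # so it stays a single root leaf
    tree = DecisionTreeClassifier(min_samples_split=200, random_state=0).fit(X, y)

    # Every example falls into the same leaf and gets the same class confidences
    proba = tree.predict_proba(X)
    print(np.unique(proba, axis=0))  # a single distinct row
    ```

    The same collapse happens whenever the effective tree is a single leaf, whatever parameter caused it.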
  • Prentice Member Posts: 66 Maven
    @jczogalla

    Wow, thanks! Sometimes the simplest things get overlooked. I can't believe I spent hours trying to find the cause and it was something like this, haha.
    Anyway, thanks for your help!

    Prentice