
"Problem with Simple Decision Tree Analysis"

Legacy UserLegacy User Member Posts: 0 Newbie
edited May 2019 in Help
I just downloaded RapidMiner and am evaluating it for some data mining work we are doing. I went through the tutorials and chose a simple decision tree exercise with some actual usage data that I have (1300 users, 16 attributes, predicting use of one tool). I get the Use variable split (100/1200) as a result of the Decision Tree operator, but no partitions. I have checked that the data is being read fine, and the parameters all look consistent with the tutorial examples. I must be missing something obvious.

Any help gratefully accepted,

Dr. Richard Y Flanagan
Rohm and Haas Company

Answers

  • TobiasMalbrechtTobiasMalbrecht Moderator, Employee, Member Posts: 291  RM Product Management
    Hi Richard,

    Welcome to our forum! Do I understand correctly that the decision tree you get consists of only one node, i.e. the root? You can force the decision tree learner to prune less by setting its parameters appropriately. The tooltips displayed when hovering your mouse over a parameter give some help on what it does. Although there is normally no general rule for setting these parameters (the optimal settings depend crucially on your data), you may try setting the parameter [tt]minimal_leaf_size[/tt] to a smaller value and setting the parameter [tt]no_pruning[/tt] to true.

    Hope this was helpful in some way.
    Regards,
    Tobias
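    For readers outside RapidMiner: the effect Tobias describes can be sketched with scikit-learn's DecisionTreeClassifier, which exposes analogous controls (this is an illustrative analogy, not RapidMiner's API; the synthetic data below merely mimics the 100/1200 class imbalance from the question).

    ```python
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic, imbalanced data roughly like the question's 100/1200 split
    # (1300 examples, 16 attributes); not the poster's actual data.
    X, y = make_classification(n_samples=1300, n_features=16,
                               weights=[0.92], random_state=0)

    # A large minimal leaf size forbids any split (both children would need
    # >= 700 examples, impossible with 1300 total), so the tree stays a
    # single root node -- the symptom described in the question.
    stump = DecisionTreeClassifier(min_samples_leaf=700,
                                   random_state=0).fit(X, y)

    # A smaller minimal leaf size (and no cost-complexity pruning,
    # ccp_alpha=0.0) lets the learner actually partition the data,
    # analogous to lowering minimal_leaf_size / enabling no_pruning.
    tree = DecisionTreeClassifier(min_samples_leaf=5, ccp_alpha=0.0,
                                  random_state=0).fit(X, y)

    print(stump.get_depth(), tree.get_depth())  # root-only tree has depth 0
    ```

    The same principle applies in RapidMiner: if every candidate split would produce a leaf smaller than the configured minimum, or pruning collapses all branches, the learner returns only the root.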
  • Legacy UserLegacy User Member Posts: 0 Newbie
    Thanks Tobias, that worked, along with cleaning up some missing fields. Somehow, I knew that having access to a system had to affect how much you use it; glad I can prove it now.