How to Change Prediction Graphs in Model Simulator

User36964 Member Posts: 11 Contributor I
edited December 2018 in Help
Hi to all,

I use a decision tree model. I ran it on two datasets (one preprocessed, the other raw) to compare their performance. Both datasets label cases as 1 and controls as 0.

When I run my model with the raw dataset, the Model Simulator's output gives information about Prediction: 0, like: "The outcome is most likely 0, but the model is not very confident. In fact, the confidence for this decision is only 80.00% ..."

Then I run the model with the preprocessed dataset, and the Model Simulator's output gives information about Prediction: 1, like: "The outcome is most likely 1, but the model is not very confident. In fact, the confidence for this decision is only 54.95%."

Is there a way to fix the Model Simulator's output so that it always reports on Prediction: 1?
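The two outputs above are consistent with a single binary model: the Simulator explains the *predicted* class (the argmax over class probabilities), so whether it talks about 0 or 1 depends on which class wins for that example. A conceptual sketch (plain Python, not RapidMiner code; the function name and numbers are illustrative) of the difference between the predicted class's confidence and the confidence for a chosen class of interest:

```python
# Conceptual sketch (not RapidMiner internals): a binary classifier predicts
# the class with the highest probability, while a "class of interest" view
# reports the probability of that chosen class regardless of the prediction.

def simulate_output(prob_class_1, class_of_interest=1):
    """Return (predicted_class, predicted_confidence, interest_confidence)."""
    probs = {0: 1.0 - prob_class_1, 1: prob_class_1}
    predicted = max(probs, key=probs.get)  # argmax over the two classes
    return predicted, probs[predicted], probs[class_of_interest]

# Raw-dataset example: P(1) = 0.20, so the model predicts 0 with ~80%
# confidence, while the confidence for class 1 is only ~20%.
print(simulate_output(0.20))

# Preprocessed-dataset example: P(1) = 0.5495, so the model predicts 1
# with ~54.95% confidence.
print(simulate_output(0.5495))
```

In other words, setting the class of interest changes which class's confidence is highlighted, not which class the model predicts; the prediction itself follows the probabilities.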

Answers

  • mschmitz Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 2,045  RM Data Scientist
    Hi,
    have you set the class of highest interest in the second step?

    BR,
    Martin
    - Head of Data Science Services at RapidMiner -
    Dortmund, Germany
  • User36964 Member Posts: 11 Contributor I
    Yes, I chose 1 as the class of interest in both cases.