How to Change Prediction Graphs in Model Simulator
I use a decision tree model. I ran it on two datasets (one preprocessed, the other raw) to compare their performance. In both datasets, cases are labeled 1 and controls are labeled 0.
When I run my model with the raw dataset, the Model Simulator's output reports Prediction: 0, e.g. "The outcome is most likely 0, but the model is not very confident. In fact, the confidence for this decision is only 80.00% ..."
When I run the model with the preprocessed dataset, the Model Simulator's output reports Prediction: 1, e.g. "The outcome is most likely 1, but the model is not very confident. In fact, the confidence for this decision is only 54.95%."
Is there a way to configure the Model Simulator so that its output always reports the prediction and confidence for class 1?