Hi all,
I'm building a model with a random forest. To find the best-fitting model I use cross-validation. According to the cross-validation results, my model has 74.35% accuracy and an AUC of 0.830, which are good performance results for the model.
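For context, this is roughly the setup I mean; a minimal sketch in scikit-learn (not the tool I'm actually using, and with synthetic data standing in for my dataset), computing cross-validated accuracy and AUC:

```python
# Sketch of cross-validated random forest evaluation (assumed
# scikit-learn equivalent of the setup; synthetic data).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_validate

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_validate(model, X, y, cv=10,
                        scoring=["accuracy", "roc_auc"])

# Mean performance over the 10 folds:
print("mean accuracy:", scores["test_accuracy"].mean())
print("mean AUC:", scores["test_roc_auc"].mean())
```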
I also ran the model simulator. Its output lists the contradicting and supporting parameters, but it indicates that my results are not confident and shows a confidence level of 55%. I wonder how this confidence level is calculated and what it indicates. How can the AUC be high when the confidence level is this low?
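My working assumption (not confirmed by the tool's documentation) is that AUC is an aggregate metric over the whole test set, while the simulator's confidence is a per-input quantity such as the predicted class probability for the single row being simulated, so the two can diverge. A sketch of that distinction, again assuming a scikit-learn equivalent:

```python
# Sketch contrasting aggregate AUC with a per-row "confidence"
# (predicted class probability). This mirrors one plausible definition
# of the simulator's confidence, not its documented formula.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

model = RandomForestClassifier(n_estimators=100, random_state=1)
model.fit(X_tr, y_tr)

# Aggregate metric over the whole test set:
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

# Per-row confidence: probability of the predicted class for one input.
row_conf = model.predict_proba(X_te[:1]).max()

print(f"AUC = {auc:.3f}, confidence for this row = {row_conf:.2%}")
```

A model can rank examples well overall (high AUC) while still assigning a middling probability to the particular input you happen to simulate.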
Another issue concerns the optimize option of the model simulator. In its output window there is an Optimize button, which adjusts the input parameters in order to increase the confidence level; the confidence level of my model increased to 90%. As the confidence level increases, some of the contradicting and supporting parameters change. Is there a way to see the newly formed model (produced by the optimization) and its performance indicators?
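To make the question concrete, here is a hypothetical sketch of what I imagine the Optimize button might be doing: searching over the input values of the single simulated row to maximize the model's predicted class probability, without retraining the model itself. This is purely an assumption about the tool's behavior, not its actual algorithm:

```python
# Hypothetical sketch: random search over one input row to raise the
# model's predicted probability (an assumption about what "Optimize"
# does; the model's parameters are never changed, only the inputs).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=5, random_state=2)
model = RandomForestClassifier(n_estimators=100, random_state=2).fit(X, y)

rng = np.random.default_rng(2)
base = X[0].copy()
start_conf = model.predict_proba([base]).max()
best, best_conf = base, start_conf

for _ in range(200):  # random search around the original row
    cand = base + rng.normal(scale=0.5, size=base.shape)
    conf = model.predict_proba([cand]).max()
    if conf > best_conf:
        best, best_conf = cand, conf

print(f"confidence: {start_conf:.2%} -> {best_conf:.2%}")
```

If the tool works this way, the underlying model (and hence its accuracy/AUC) would be unchanged by optimization, and only the simulated input row would differ, which is why I'd like to see exactly what the optimized result represents.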