Confidence or Prediction Intervals
Dear All,
When reporting a performance metric (e.g. AUC) of a model that was trained on a single data set and tested on a hold-out set, what is the proper way to assess its variance: calculating the confidence interval of the AUC, or the prediction interval?
Many thanks
Nikos
Answers
This is a bit tricky: both are centered around the same value, but a prediction interval is wider than a confidence interval. The confidence interval describes the uncertainty of the estimated metric itself, while the prediction interval also covers the variability of individual future observations, so if you care about the error of single predictions you can go with prediction intervals.
Varun
https://www.varunmandalapu.com/
Be Safe. Follow precautions and Maintain Social Distancing
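Since the question is about the uncertainty of a single hold-out AUC, here is a minimal sketch (outside RapidMiner, in Python with scikit-learn) of a percentile-bootstrap confidence interval for that metric. The function name, the synthetic data, and the number of resamples are illustrative assumptions, not something taken from this thread.

```python
# Sketch: percentile-bootstrap confidence interval for a hold-out AUC.
# y_true / y_score and n_boot are illustrative assumptions.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

def bootstrap_auc_ci(y_true, y_score, n_boot=2000, alpha=0.05):
    """Percentile bootstrap CI for the AUC of a single hold-out set."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    n = len(y_true)
    aucs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)          # resample the hold-out set with replacement
        if len(np.unique(y_true[idx])) < 2:  # skip resamples containing only one class
            continue
        aucs.append(roc_auc_score(y_true[idx], y_score[idx]))
    lower, upper = np.percentile(aucs, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return roc_auc_score(y_true, y_score), lower, upper

# Example with synthetic hold-out labels and scores.
y_true = rng.integers(0, 2, 500)
y_score = np.clip(y_true * 0.3 + rng.normal(0.4, 0.25, 500), 0, 1)
auc, lo, hi = bootstrap_auc_ci(y_true, y_score)
print(f"AUC = {auc:.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
```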
I checked, and I don't see a built-in option in RapidMiner for this. I think it is a bit complicated to calculate.
@mschmitz or @IngoRM, any comments on this?
Thanks
Varun
https://www.varunmandalapu.com/
Be Safe. Follow precautions and Maintain Social Distancing
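Since there does not seem to be a built-in operator for this, one workaround is to compute the interval outside the process (or in a scripting step, if available). Below is a minimal sketch, assuming the Hanley and McNeil (1982) normal approximation for the standard error of the AUC; the helper name and the example counts are made up for illustration.

```python
# Sketch: analytic confidence interval for an observed AUC using the
# Hanley & McNeil (1982) normal approximation. Helper name and example
# inputs are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def hanley_mcneil_ci(auc, n_pos, n_neg, alpha=0.05):
    """Approximate (1 - alpha) confidence interval for an observed AUC."""
    q1 = auc / (2.0 - auc)
    q2 = 2.0 * auc**2 / (1.0 + auc)
    se = np.sqrt((auc * (1.0 - auc)
                  + (n_pos - 1.0) * (q1 - auc**2)
                  + (n_neg - 1.0) * (q2 - auc**2)) / (n_pos * n_neg))
    z = norm.ppf(1.0 - alpha / 2.0)
    return auc - z * se, auc + z * se

# Example: AUC of 0.80 observed on a hold-out set with 120 positives and 380 negatives.
print(hanley_mcneil_ci(0.80, n_pos=120, n_neg=380))
```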