"Maximum theoretical accuracy"
I've been wondering: is there a theory of the maximum accuracy one can achieve on a given example set without over-fitting? I'm looking for something that could tell me, "whatever method you use to do regression/classification on this dataset, you'll never achieve over x% accuracy," or something along those lines.
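To make the question concrete, here's a small sketch of the kind of bound I mean. I believe the relevant concept is called the Bayes error rate; the two-Gaussian setup below is purely a made-up example, not my actual data:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical setup: two equally likely classes whose single feature
# follows overlapping Gaussians N(0, 1) and N(2, 1).
mu0, mu1, sigma = 0.0, 2.0, 1.0

# With equal priors and equal variances, the optimal decision boundary
# is the midpoint between the means.
threshold = (mu0 + mu1) / 2.0  # = 1.0

# Bayes error: the probability mass each class puts on the wrong side
# of the boundary. No classifier, however clever, can do better.
err0 = 1.0 - normal_cdf(threshold, mu0, sigma)  # class 0 lands above
err1 = normal_cdf(threshold, mu1, sigma)        # class 1 lands below
bayes_error = 0.5 * err0 + 0.5 * err1

max_accuracy = 1.0 - bayes_error
print(f"Accuracy ceiling for this toy problem: {max_accuracy:.4f}")
```

Here the ceiling is about 84% no matter what model is used, because the class distributions genuinely overlap. What I'm after is whether something like this can be estimated for a real dataset, where the true distributions are unknown.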
I tried a quick Google search but didn't find anything along those lines, so I'm asking the experts now.
Thanks for any leads on the subject!