Performance Vectors (and Confusion Matrix)
I have a process in which I train a classifier and then test its performance. My example set has a binominal label, and when I output the data generated by the binominal performance evaluator I get a nice view with information about (in this case) accuracy, precision, and recall.
My problem is that the wrong label value is treated as the positive class. Is there an easy way to change that? Or, failing that, is there an easy way to access the individual elements of the confusion matrix so I can compute the metrics for the other class myself?
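To make the underlying issue concrete: precision and recall are asymmetric metrics, so which label value counts as "positive" changes their values. The sketch below illustrates this in scikit-learn rather than RapidMiner (an assumption on my part, since RapidMiner's evaluator is configured through the GUI and the label's value mapping, not code); the labels `"yes"`/`"no"` and the example data are made up for illustration.

```python
from sklearn.metrics import confusion_matrix, precision_score

# Toy binominal data (hypothetical labels "yes"/"no").
y_true = ["yes", "yes", "no", "yes", "no", "no"]
y_pred = ["yes", "no",  "no", "yes", "no", "no"]

# Pass labels= explicitly so you know which row/column is which;
# rows are true classes, columns are predicted classes.
cm = confusion_matrix(y_true, y_pred, labels=["no", "yes"])
tn, fp, fn, tp = cm.ravel()  # individual confusion-matrix elements

# Precision depends on which class is treated as positive:
p_yes = precision_score(y_true, y_pred, pos_label="yes")  # tp / (tp + fp)
p_no  = precision_score(y_true, y_pred, pos_label="no")   # tn / (tn + fn)

print(tn, fp, fn, tp)   # 3 0 1 2
print(p_yes, p_no)      # 1.0 0.75
```

So even without changing the evaluator's notion of the positive class, having the four confusion-matrix counts is enough to recompute precision and recall for whichever class you care about.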