Interpreting cross-validation results
I'm a total beginner, so please bear with me.
I have a process set up with a cross validation at the end. Inside it I have the Deep Learning, Apply Model and Performance operators. So far so good, and after I run it (4h later) I get a confusion matrix and an accuracy. And here is my question:
So I have accuracy: 35.42% +/- 47.83% (mikro: 35.42%)
Accuracy is the average accuracy of all models trained, right?
So is the +/- 47.83% the variance?
And for the confusion matrix: is it from the last model trained, or is it some kind of summary of all the folds?
To be exact, I use k-fold cross validation, so maybe my understanding of that process is wrong.
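To make my question concrete, here's a tiny Python sketch of how I *guess* the reported figure is computed: the mean of the per-fold accuracies, with the +/- part being some spread measure across folds. The fold accuracies below are made up, just to illustrate the aggregation I have in mind.

```python
import statistics

# Hypothetical per-fold accuracies from a 10-fold cross validation
# (made-up numbers, just to illustrate how I think the summary is built).
fold_accuracies = [0.30, 0.40, 0.35, 0.25, 0.45, 0.33, 0.38, 0.28, 0.42, 0.36]

mean_acc = statistics.mean(fold_accuracies)   # the headline "accuracy"
std_acc = statistics.stdev(fold_accuracies)   # sample standard deviation

# Is this what the "accuracy: X% +/- Y%" line reports?
print(f"accuracy: {mean_acc:.2%} +/- {std_acc:.2%}")
```

Is the +/- part the standard deviation like this, or really the variance?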
Thx in advance, and sorry for the noob question!