Can you maybe post your process?
To be honest, your process looks a bit odd in general. You apply the k-NN to the same data the GA has already learned on, which leads to overtraining. Furthermore, you set up the split validation by hand without taking the GA into account, which again leads to overtraining. I would suggest putting a cross validation around everything. It is also a bit strange to use a Naive Bayes to generate features and a k-NN for classification - but if it works, it works.
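To illustrate the point about wrapping cross validation around everything: the feature selection step has to run inside each fold, so the held-out data never influences which features are picked. The original thread is about RapidMiner processes, so the sketch below is only an assumption-laden analogue in scikit-learn (using `SequentialFeatureSelector` as a stand-in for the GA-based selection), not the poster's actual setup:

```python
# Sketch of the suggested fix: cross-validation AROUND the whole
# feature-selection + k-NN pipeline. The selector here is a stand-in
# for the GA in the original process; dataset and parameters are
# illustrative assumptions, not from the thread.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import Pipeline

X, y = load_iris(return_X_y=True)

knn = KNeighborsClassifier(n_neighbors=5)
pipe = Pipeline([
    # Feature selection is re-run inside every outer fold, so the
    # held-out fold never leaks into the selection step.
    ("select", SequentialFeatureSelector(knn, n_features_to_select=2)),
    ("knn", knn),
])

# The outer cross-validation wraps selection + training together,
# avoiding the optimistic bias of selecting features on all the data.
scores = cross_val_score(pipe, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f}")
```

Doing the split by hand and selecting features on the full data set first would report a score that is biased upward, which is exactly the overtraining problem described above.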