Parameter Optimizer Problem

hagen85 Member Posts: 18 Contributor II
edited November 2018 in Help
Hi There,
I am using an evolutionary parameter optimizer to determine the best parameters for my neural network. The neural network is embedded in a sliding-window X-Validation with cumulative learning. After the process finishes, I get a parameter set and a performance of 67.33%. But when I actually apply those parameters, I only get 65.85%. How is that possible? Isn't that supposed to be the same?
Regards
Hagen

Answers

  • MariusHelf RapidMiner Certified Expert, Member Posts: 1,869 Unicorn
    Hi,

    it is hard to tell anything without knowing your process (please see http://rapid-i.com/rapidforum/index.php/topic,4654.0.html ). However, yes, it is possible that you get different results, since the experimental performance is always only an approximation of the true performance. On top of that, the optimizer picks the parameter set with the best validation score, and the best of many noisy estimates tends to be optimistically biased. So depending on your setup, it is perfectly possible that you see (usually small) differences in performance.
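    The selection bias mentioned above can be illustrated with a small numerical sketch (this is not from RapidMiner itself; it assumes, for illustration, that all candidate parameter sets have the same true accuracy of 0.65 and that the validation estimate carries Gaussian noise):

    ```python
    import numpy as np

    # Sketch: why the score reported by a parameter optimizer tends to
    # exceed the score you see when you re-apply the winning parameters.
    # Every candidate's true accuracy is identical (0.65), but the
    # validation estimate is noisy; reporting the maximum of many noisy
    # estimates is optimistically biased.
    rng = np.random.default_rng(0)

    true_accuracy = 0.65   # identical for all candidates (assumption)
    n_candidates = 50      # parameter sets tried by the optimizer
    cv_noise = 0.01        # std. dev. of the validation estimate (assumption)

    # One noisy validation estimate per candidate parameter set
    cv_scores = true_accuracy + rng.normal(0.0, cv_noise, size=n_candidates)

    best_cv = cv_scores.max()   # what the optimizer reports
    print(f"reported by optimizer: {best_cv:.4f}")
    print(f"true performance:      {true_accuracy:.4f}")
    # The reported score ends up above the true 0.65 almost every run.
    ```

    The gap shrinks if the validation estimate is less noisy (more data, more folds) or if fewer candidates are compared, which matches the "usually small" differences mentioned above.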

    Best, Marius