Different results outside and inside optimize parameter operator

chaos85 Member Posts: 1 Contributor I
edited November 2018 in Help
Dear RapidMiner community,
I have a problem using the optimize parameter operator.
I have a dataset (doesn't really matter which one) and I want to optimize say a neural net learning rate.

The problem I found is that the optimal parameter value determined by the optimizer gives different results when used outside the optimizer.

To give a concrete example, I use the optimize parameter operator for the learning rate of a neural net, using x-validation.
Let's say the optimizer finds that a learning rate of 0.6 is optimal, and my performance vector gives an f-measure of 0.7.

When I use an x-validation operator on the same data, with the same settings and local random seed but without the optimizer, using the learning rate of 0.6 that the optimizer found, I get a different performance!
How can this be? Any suggestions? I even unchecked the shuffle box of the neural net.


Best regards,
Gabriel

Answers

  • MariusHelf RapidMiner Certified Expert, Member Posts: 1,869 Unicorn
    I can't reproduce this behaviour. Are you sure that the second Neural Net uses the same parameters, and that you use a local random seed for all X-Validations and Neural Nets?
    If you are using the Set Parameters operator, remember that you have to fill in the name map: the left column specifies the operator names in the parameter set (i.e., inside the optimization), and the right column the name of your second neural net outside the optimization. If the behaviour still occurs after checking all of the above, please provide an example process and example data.

    Cheers, Marius
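    The general principle Marius describes applies outside RapidMiner too: a parameter optimizer's reported score is only reproducible in a standalone validation if every source of randomness (the cross-validation splits and the model initialization) is pinned to a fixed seed, and the exact same parameter values are carried over. A minimal sketch of this in scikit-learn (an analogue, not RapidMiner itself; the dataset and parameter values here are illustrative assumptions):

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
    from sklearn.neural_network import MLPClassifier

    # Illustrative data; any fixed dataset works.
    X, y = make_classification(n_samples=300, random_state=0)

    # Pin both sources of randomness: the CV splits ("local random seed")
    # and the network's weight initialization.
    cv = KFold(n_splits=5, shuffle=True, random_state=42)
    net = MLPClassifier(max_iter=500, random_state=7)

    # Inside the optimizer: search over the learning rate.
    search = GridSearchCV(net, {"learning_rate_init": [0.01, 0.1, 0.6]}, cv=cv)
    search.fit(X, y)

    # Outside the optimizer: re-validate with the best parameter,
    # the same seeds, and the same CV splitter.
    outside = cross_val_score(
        MLPClassifier(max_iter=500, random_state=7,
                      learning_rate_init=search.best_params_["learning_rate_init"]),
        X, y, cv=cv,
    ).mean()

    # With all seeds fixed, the two scores agree; if they differ,
    # some parameter or seed was not carried over.
    assert np.isclose(search.best_score_, outside)
    ```

    If the assertion fails in your own setup, that points to exactly the mismatch Marius suspects: a parameter or seed that differs between the inner and outer validation.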