
Why do the relative errors change?

s_sorrenti3 Member Posts: 4 Contributor I
edited December 2018 in Help

Hello,
When I execute the neural net with cross-validation and the linear regression with cross-validation together in the same process, I get the following relative errors:

[Image: errors together.png]

When I execute only the neural net with cross-validation in a separate process, I get the following relative error:

[Image: error.png]

Why do the relative errors change depending on whether I run the learning models together or separately?

 

 


Answers

  • SGolbert RapidMiner Certified Analyst, Member Posts: 344 Unicorn

    Hi,

     

    This is surely caused by a different random seed being used in the two cases. If you want to avoid that behaviour, tick the "use local random seed" option in the Cross-Validation operator.

     

    However, the differences caused by changing the seed should be minimal if your process is set up correctly. In your case it looks as if some of the neural net models are not converging, so you get very disparate results across the folds of the cross-validation. I think you need to tune your models and their optimization options.
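
    As an aside, here is a minimal sketch in Python with scikit-learn (an analogy only, not the RapidMiner implementation; the data set, network size and seed values are assumptions) showing why a fixed seed makes the cross-validated relative error reproducible: the same seed gives the same folds and the same network initialisation, so repeated runs agree, while a different seed gives a different estimate.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.model_selection import KFold
    from sklearn.neural_network import MLPRegressor

    # Hypothetical data standing in for the original data set (an assumption).
    X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)

    def cv_relative_error(seed):
        """10-fold cross-validated relative error (%), with the seed fixing
        both the fold split and the network initialisation."""
        folds = KFold(n_splits=10, shuffle=True, random_state=seed)
        fold_errors = []
        for train_idx, test_idx in folds.split(X):
            net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=seed)
            net.fit(X[train_idx], y[train_idx])
            pred = net.predict(X[test_idx])
            # relative error = mean(|prediction - label| / |label|) as a percentage;
            # the small constant guards against labels that are exactly zero
            rel = np.abs(pred - y[test_idx]) / (np.abs(y[test_idx]) + 1e-9)
            fold_errors.append(100.0 * rel.mean())
        return np.mean(fold_errors)

    print(cv_relative_error(seed=1992))  # reproducible: same value on every run
    print(cv_relative_error(seed=1992))  # identical to the line above
    print(cv_relative_error(seed=7))     # a different seed -> a different estimate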

     

    Regards,

    Sebastian

  • s_sorrenti3 Member Posts: 4 Contributor I

    I have selected the "use local random seed" option in the Cross-Validation operator.
    When I execute the learning models with cross-validation together in the same process, I get a relative error of 73.34%.
    When I execute only the neural net with cross-validation in a separate process, I get a relative error of 203.41%.

  • Thomas_Ott RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 1,761 Unicorn

    @s_sorrenti3 type a seed like '1992' into each Cross-Validation operator and try again. If that doesn't work, follow the rules of the Community by posting your XML and data.
