"Early stopping based on cross-validation"

ilaria_gori Member Posts: 15 Maven
edited June 2019 in Help
Dear all,
I am using RapidMiner Studio 5.3 and I would like to use early stopping based on cross-validation (see http://en.wikipedia.org/wiki/Early_stopping#Early_stopping_based_on_cross-validation) when training neural networks. I cannot find any operator in RapidMiner that provides this. Is this feature implemented in some way, or do the developers plan to implement it?

Thank you!



    fras Member Posts: 93 Contributor II
    The current implementation of the NeuralNet operator uses the
    parameter epsilon as its stopping criterion. No matter which
    criterion you apply, you always want to avoid overfitting your
    data. Ultimately, the way to check whether your model overfits
    is to use the X-Validation operator together with a Performance
    operator. This also lets you check other algorithms that may be
    more appropriate for your data.
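    For reference, here is what early stopping based on a held-out validation set (the technique from the Wikipedia link above) looks like outside RapidMiner. This is just an illustrative sketch in Python with scikit-learn, not a RapidMiner feature; the patience value and network size are arbitrary choices for the example:

    ```python
    # Sketch of early stopping: train incrementally, monitor accuracy on a
    # held-out validation set, and stop once it stops improving for a few
    # epochs ("patience"). Uses scikit-learn purely for illustration.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=500, random_state=0)
    X_train, X_val, y_train, y_val = train_test_split(
        X, y, test_size=0.2, random_state=0)

    clf = MLPClassifier(hidden_layer_sizes=(10,), random_state=0)
    best_score, patience, bad_epochs = -np.inf, 5, 0
    for epoch in range(200):
        # one incremental training pass over the training data
        clf.partial_fit(X_train, y_train, classes=np.unique(y))
        score = clf.score(X_val, y_val)
        if score > best_score:
            best_score, bad_epochs = score, 0
        else:
            bad_epochs += 1
        if bad_epochs >= patience:  # validation score stopped improving
            break
    print(f"stopped after {epoch + 1} epochs, "
          f"best validation accuracy {best_score:.2f}")
    ```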
    MariusHelf RapidMiner Certified Expert, Member Posts: 1,869 Unicorn
    Let me add that you can optimize the parameters of the Neural Net operator: while it is not possible to perform any kind of early stopping, you can try different configurations for the Neural Net, validate them with the X-Validation operator, and at the end select those parameters that worked best.

    To help you to automate this you can use the Optimize Parameters operator.
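    Conceptually, Optimize Parameters wrapped around X-Validation does something like the following sketch: try several network configurations, score each by cross-validation, and keep the best. This is shown in Python with scikit-learn only to make the idea concrete; the parameter grid values are arbitrary examples:

    ```python
    # Sketch of parameter optimization via cross-validation: grid-search
    # over neural-net hyperparameters, scoring each candidate with 5-fold
    # cross-validation and selecting the best-performing combination.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=300, random_state=0)

    param_grid = {
        "hidden_layer_sizes": [(5,), (10,), (20,)],  # network size candidates
        "alpha": [1e-4, 1e-2],                       # L2 regularization strength
    }
    search = GridSearchCV(
        MLPClassifier(max_iter=500, random_state=0),
        param_grid, cv=5)                            # 5-fold cross-validation
    search.fit(X, y)
    print("best parameters:", search.best_params_)
    print(f"cross-validated accuracy: {search.best_score_:.2f}")
    ```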

    Best regards,