
Which values can I use for the Optimize Parameters operator in a Deep Learning prediction model?

JunoSital Member Posts: 11 Contributor I
edited December 2019 in Help

I am currently implementing a performance test with the Gradient Boosted Trees and the Deep Learning algorithms for a prediction model.
For the GBT I used the number of trees, maximal depth, and the learning rate for the Optimize Parameters operator. But I have no idea which parameters I can try to optimize my deep learning model with.

Thanks for your help, and a happy new year  :)

Best Answer

  • varunm1 Posts: 999   Unicorn
    Solution Accepted
    Hello @JunoSital

    Sorry for the delayed response. The option is available in "Expert Parameters" as "input dropout ratio". One thing I am not sure about is why the DL operator is forcing you to set a dropout ratio, as it is not mandatory. Try setting the dropout ratio to 0.2 for each layer here.
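The constraint behind the H2O error quoted in this thread ("Must have 2 hidden layer dropout ratios") is that the dropout-ratio list needs exactly one entry per hidden layer. A minimal Python sketch of that check follows; the function name and signature are hypothetical illustrations, not the actual H2O or RapidMiner API:

```python
# Hypothetical sketch of the check H2O's error message implies:
# the dropout-ratio list must have one entry per hidden layer.
# All names here are illustrative, not the real H2O/RapidMiner API.

def build_dl_params(hidden_layer_sizes, hidden_dropout_ratios=None):
    """Validate DL parameters the way the H2O error message suggests."""
    if hidden_dropout_ratios is None:
        # No dropout requested: nothing to check.
        return {"hidden": hidden_layer_sizes}
    if len(hidden_dropout_ratios) != len(hidden_layer_sizes):
        raise ValueError(
            f"Must have {len(hidden_layer_sizes)} hidden layer dropout ratios."
        )
    return {"hidden": hidden_layer_sizes,
            "hidden_dropout_ratios": hidden_dropout_ratios}

# Two hidden layers of 50 nodes each, as in this thread,
# need two dropout ratios (e.g. 0.2 for each layer):
params = build_dl_params([50, 50], [0.2, 0.2])
```

With only one ratio supplied for two layers, the same check raises the kind of error reported below.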


  • varunm1 Moderator, Member Posts: 999   Unicorn
    Hello @JunoSital

    As there are many hyperparameters that can be tuned in a DL model, two important ones are the learning rate and the number of epochs; you can tune these first. Then you can change the number of hidden nodes and hidden layers. Unfortunately, Optimize Parameters doesn't support tuning these two, so you need to do that manually. Start with a simple network and then grow it to see how it performs.
  • JunoSital Member Posts: 11 Contributor I
    Hello @varunm1,

    Thank you very much for your fast reply.
    I'll try this out right now and give you feedback.

    I wish you a happy new year.
  • JunoSital Member Posts: 11 Contributor I
    Hello @varunm1,

    I set the following parameters:

    learning rate: 0.01 to 0.2 in 19 steps (Optimize Parameters)

    epochs: 10 to 1000 in 30 steps (Optimize Parameters)

    hidden layer sizes: 50/50 (manually)

    After 20 minutes I received the following error:

    Model training error (H2O).
    Error while training the H2O model: Illegal argument(s) for DeepLearning model: ERRR on field: _hidden_dropout_ratios: Must have 2 hidden layer dropout ratios.

    Where can I set these hidden dropout ratios?

    Thank you for your help!
  • varunm1 Moderator, Member Posts: 999   Unicorn
    Hello @JunoSital

    I am away from my computer. If you have two layers, then you need to set a dropout ratio for each layer in the "hidden dropout ratios" option; you can select the same or different dropout ratios for each layer.

    I will check once I reach my PC. In the meantime, you can try to set two ratios in the "hidden dropout ratios" option in the Deep Learning operator's model parameters.
  • JunoSital Member Posts: 11 Contributor I
    Hello @varunm1,

    Thanks again for your help.
    In the parameter section of the Deep Learning operator I can't find the parameter "hidden dropout ratios".
    Here are the available settings:

    The "hidden dropout ratios" parameter is also not found in the expert parameters.

    Kind regards
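The tuning recommended above (a grid over learning rate and epochs, as set up in this thread) can be sketched roughly as follows. The `evaluate` function is a made-up stand-in for "train the model and return validation performance", not RapidMiner's or H2O's real API; the ranges mirror the ones used in the thread:

```python
# Rough sketch of what the Optimize Parameters operator does for the two
# recommended hyperparameters: try each (learning rate, epochs) combination
# and keep the best-scoring one. The scoring function is a fake stand-in.
import itertools

def evaluate(learning_rate, epochs):
    # Placeholder for "train the DL model and return validation accuracy".
    # This fake score simply peaks near learning_rate=0.05, epochs=300.
    return 1.0 - abs(learning_rate - 0.05) - abs(epochs - 300) / 1000.0

learning_rates = [0.01 + i * 0.01 for i in range(20)]   # 0.01 .. 0.20
epoch_values = range(10, 1001, 33)                      # 10 .. 1000, ~30 steps

best = max(itertools.product(learning_rates, epoch_values),
           key=lambda pair: evaluate(*pair))
print("best (learning_rate, epochs):", best)
```

Hidden layer sizes and the number of layers would still be varied by hand, as noted above, by re-running the search for each network shape.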