About H2O Deep Learning Operator

Rapidminerpartner Member Posts: 26 Contributor II
edited April 23 in Help

Removed by the writer

Answers

  • hughesfleming68 Member Posts: 298 Unicorn
    Did you set your random seeds to be the same? In TensorFlow, I have to jump through many hoops to get repeatable results, including switching off multithreading and setting seeds for NumPy, etc. This is in Python; you would have to do the same in RapidMiner.

    It is common to see a drop in performance when you take away the randomness. The other alternative would be to average the results over multiple runs. A lot of these problems are data dependent but common to all deep learning operators.
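The seeding routine described above can be sketched in Python. This is a minimal stdlib-only illustration of why fixing the seed makes runs repeatable; the NumPy and TensorFlow equivalents noted in the comments are the standard `numpy.random.seed` and `tf.random.set_seed` calls, assuming recent library versions:

```python
import random

# Fix every source of randomness before comparing runs.
# Equivalent calls in other libraries (recent versions assumed):
#   NumPy:       numpy.random.seed(1234)
#   TensorFlow:  tf.random.set_seed(1234)
#   Python hash: os.environ["PYTHONHASHSEED"] = "0" (before interpreter start)

def sample_weights(seed, n=5):
    rng = random.Random(seed)  # isolated, seeded generator
    return [rng.uniform(-1.0, 1.0) for _ in range(n)]

run_a = sample_weights(1234)
run_b = sample_weights(1234)
print(run_a == run_b)  # identical seeds give identical "initial weights"
```

Note that multithreaded execution can still reorder floating-point reductions, which is why the post above also mentions switching off multithreading.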
  • hughesfleming68 Member Posts: 298 Unicorn
    edited April 22
    The H2O examples from the link have their seed set to 1234. Set the seed to the same value in RapidMiner. You will want to leave your layers and epochs the same and be 100% certain that you are splitting your data in exactly the same way.

    If that does not fix your problem, then it is safe to assume that there are differences under the hood between versions. Do you have to use H2O?
  • hughesfleming68 Member Posts: 298 Unicorn
    Differences due to randomness can be a headache to track down. At the same time, you will want to vary your seeds to make sure that your good results were not down to randomness as well. Sometimes you get lucky, and it is double-edged. It is one of the more troublesome aspects of deep learning.
  • hughesfleming68 Member Posts: 298 Unicorn
    edited April 22
    I don't have RapidMiner in front of me right now, but if I remember correctly the default seed when you select a fixed seed is 1992. Make sure that both are set the same and go from there. This still might not solve your problem.
  • tkenez Employee, RapidMiner Certified Expert, Member Posts: 15 RM Product Management
    Hello there,

    I think the main reason you're experiencing such a difference between the two models is that they are using very different versions of H2O. Right now, models in RapidMiner that use H2O under the hood (Deep Learning being one of them) are running with a dated version of the library. On another note, H2O does not prioritize compatibility of models between two releases, so it is very much expected that models built with two different versions of the library produce different results.

    The RapidMiner engineering team is currently working on upgrading the library to the most recent stable version, so you can expect that improvement soon. But since this is not a continuous stream of updates, identical behavior can only be expected until the next stable version of the H2O library is released.

    What you could do when comparing the two is to use the exact same H2O library version with Python as the one used within RapidMiner.
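One way to match the versions is to pin the Python client to the release that RapidMiner bundles. This is a sketch; the version string below is a placeholder, not the actual version RapidMiner ships, which you would need to look up first (e.g. in the operator's log or H2O cluster status):

```shell
# Replace the placeholder with the version bundled in your RapidMiner install.
pip uninstall -y h2o
pip install h2o==<version-bundled-with-rapidminer>
python -c "import h2o; print(h2o.__version__)"  # verify the match
```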

    Hope this helps,
    Tamas
  • hughesfleming68 Member Posts: 298 Unicorn
    To be honest, the fact that H2O does not prioritise compatibility between versions is a flaw. I have been hesitant to use it for that reason.
  • Rapidminerpartner Member Posts: 26 Contributor II

    Dear everyone,

    I am really sorry. I want to delete this post, but there is no way to delete it.

    There is a serious misunderstanding: I made an important mistake in calculating MAPE.

    I said above that the H2O operator in RapidMiner is excellent, which turned out not to be the case. That is, the H2O operator in RapidMiner performs worse than TensorFlow deep learning, which I have now checked.

    Sorry for all the misunderstanding and confusion, and thank you for your comments above.

    So, as it is said above, the H2O operator in RapidMiner uses a different version than the one in Python. There were also good comments, advice, and knowledge from all of you.

    Thank you for those and have a nice day.
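Since the retraction above hinges on a MAPE miscalculation: MAPE is easy to get subtly wrong, e.g. by dividing by the prediction instead of the actual value, or by forgetting the absolute value. A minimal sketch of the usual definition, for reference:

```python
def mape(actual, predicted):
    """Mean Absolute Percentage Error: mean(|a - p| / |a|) * 100.

    Divides by the *actual* values; zeros in `actual` make it undefined.
    """
    assert len(actual) == len(predicted) and len(actual) > 0
    errors = [abs(a - p) / abs(a) for a, p in zip(actual, predicted)]
    return 100.0 * sum(errors) / len(errors)

print(mape([100, 200], [110, 180]))  # 10.0
```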

  • hughesfleming68 Member Posts: 298 Unicorn
    It is pretty poor form to delete your posts; now all the responses are meaningless. I am not sure how much time you have spent studying deep learning architectures, but it is not that simple. It isn't plug and play.
  • Rapidminerpartner Member Posts: 26 Contributor II
    edited April 24

    to hughesfleming68:

    have a nice weekend and see you~
