
De-normalizing

alfileres Member Posts: 6 Contributor II
edited November 2018 in Help
Hi all,

I use the Preprocessing-> Normalization operator to normalize the data. However, I do not know what to do to get the data back in the original units after obtaining the predictions.

Any hint?

Thank you!

Answers

  • Darrell Member Posts: 16 Maven
    alfileres,

    I think you might be experiencing the same issue that I had earlier.  See Ingo's response in the "Normalization Issue" thread in the Problem and Support section.  The simple fixes that Ingo suggested there, i.e. adding the IOSelector and ModelGrouper operators, fixed the normalization issue that I was having.

    Best of luck...
  • alfileres Member Posts: 6 Contributor II
    Dear Darrell,

    I found the thread that you mention at http://rapid-i.com/rapidforum/index.php/topic,211.0.html
    However, I believe that my problem is different from the issue that is solved there.

    In the thread you mention, the problem was how to embed the normalization model together with the learning model.
    In contrast, my question is about how to de-normalize the model's output to get meaningful results for the domain experts.

    Thank you
  • IngoRM Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, Community Manager, RMResearcher, Member, University Professor Posts: 1,751 RM Founder
    Hi,

    The first question is: why not normalize only the base attributes and skip the label? If you do want to include the label in the normalization, there is no operator for this "de-normalization", but you can set up a feature generation that transforms the predicted value back to the original range. You can obtain the necessary parameters from the visualization of the normalization model. Here is a complete example:

    <operator name="Root" class="Process" expanded="yes">
        <operator name="ExampleSetGenerator" class="ExampleSetGenerator">
            <parameter key="attributes_lower_bound" value="0.0"/>
            <parameter key="target_function" value="sum"/>
        </operator>
        <operator name="NoiseGenerator" class="NoiseGenerator">
            <list key="noise">
            </list>
        </operator>
        <operator name="Label2Regular" class="ChangeAttributeRole">
            <parameter key="name" value="label"/>
        </operator>
        <operator name="Normalization" class="Normalization">
            <parameter key="return_preprocessing_model" value="true"/>
        </operator>
        <operator name="Regular2Label" class="ChangeAttributeRole">
            <parameter key="name" value="label"/>
            <parameter key="target_role" value="label"/>
        </operator>
        <operator name="LinearRegression" class="LinearRegression">
            <parameter key="keep_example_set" value="true"/>
        </operator>
        <operator name="ModelApplier" class="ModelApplier">
            <list key="application_parameters">
            </list>
        </operator>
        <operator name="ChangeAttributeName" class="ChangeAttributeName">
            <parameter key="new_name" value="pred"/>
            <parameter key="old_name" value="prediction(label)"/>
        </operator>
        <operator name="FeatureGeneration" class="FeatureGeneration">
            <list key="functions">
              <parameter key="mult_pred" value="*(pred,sqrt(const[47.175602123229964]()))"/>
              <parameter key="transformed_pred" value="+(mult_pred,const[24.811560028833878]())"/>
            </list>
            <parameter key="keep_all" value="true"/>
        </operator>
    </operator>
    Cheers,
    Ingo
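The FeatureGeneration step in the process above can be sketched outside RapidMiner. The constants in the process are presumably the label's variance (47.1756…, whose square root is the standard deviation) and its mean (24.8115…), so the back-transformation is prediction × std + mean. A minimal Python sketch under that assumption:

```python
import numpy as np

# Statistics as read from the normalization model (assumed: 47.17... is the
# label's variance, 24.81... its mean, matching the process above).
variance = 47.175602123229964
mean = 24.811560028833878
std = np.sqrt(variance)

rng = np.random.default_rng(0)
original = rng.normal(mean, std, size=100)

normalized = (original - mean) / std  # forward z-score normalization
# The FeatureGeneration step: *(pred, sqrt(variance)), then + mean
restored = normalized * std + mean

assert np.allclose(restored, original)
```

The round trip recovers the original values exactly (up to floating-point precision), which is why reading the mean and variance off the normalization model is enough to de-normalize predictions by hand.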
  • cyborghijacker Member Posts: 19 Contributor II

    Dear Ingo


    Would it be OK to apply normalization only to the regular (predictor) attributes and not to the label (prediction)?


    Regards

    Ben

  • sgenzer Administrator, Moderator, Employee, RapidMiner Certified Analyst, Community Manager, Member, University Professor, PM Moderator Posts: 2,959 Community Manager

    Hello @cyborghijacker -


    Just FYI about the forum... if you want to call someone's attention, use the @ handle with the username.  Very effective.  :)


    Scott

  • Thomas_Ott RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 1,761 Unicorn

    My wife considers me 'de-normalized!' yuk yuk yuk!

  • Telcontar120 Moderator, RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 1,635 Unicorn

    Why doesn't the De-Normalize operator work for you in this case?  You just need to feed the original Normalize preprocessing model into it; it returns the inverse model, which can then be applied to the normalized data, and you should get the de-normalized data back out.


    Brian T.
    Lindon Ventures 
    Data Science Consulting from Certified RapidMiner Experts
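The round trip described above — fit a normalization (preprocessing) model, apply it, then apply its inverse — can be illustrated with scikit-learn's StandardScaler as a stand-in for RapidMiner's Normalize/De-Normalize pair (a sketch of the concept, not the RapidMiner API itself):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[10.0], [20.0], [30.0], [40.0]])

scaler = StandardScaler().fit(X)           # the "normalize preprocessing model"
X_norm = scaler.transform(X)               # normalized data
X_back = scaler.inverse_transform(X_norm)  # the "inverse model" applied

assert np.allclose(X_back, X)
```

The key point is the same in both tools: the inverse transform is derived from the fitted normalization model, not re-estimated from the normalized data.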
  • cyborghijacker Member Posts: 19 Contributor II

    I see, but I am still somewhat confused. Consider the process below, which trains a model that is then used on new data. How do I connect things so that the new data is normalized using the normalization model obtained from the training data, the Deep Learning model is then applied to it to generate the prediction, and a De-Normalize operator finally converts the prediction back to its initial (unnormalized) value? I hope my process is sound, though.


    [attached process screenshot: rapidminer_qn1.png]

  • cyborghijacker Member Posts: 19 Contributor II

    @sgenzer Thank you. I just used it!
