I use the Preprocessing -> Normalization operator to normalize the data. However, I do not know how to get the data back into the original units after obtaining the predictions.
Answers
I think you might be experiencing the same issue that I had earlier. See Ingo's response in the "Normalization Issue" thread in the Problem and Support section. Using the simple fixes that Ingo suggested, i.e., adding the IOSelector and ModelGrouper operators, fixed the normalization issue that I was having.
Best of luck...
I found the thread that you mention at http://rapid-i.com/rapidforum/index.php/topic,211.0.html
However, I believe that my problem is different from the issue that is solved there.
In the thread you mention, the problem was how to embed the normalization model together with the learning model.
In contrast, my question is about how to de-normalize the predictions to get meaningful output for the domain experts.
Thank you
First question: why don't you normalize only the base attributes, skipping the label? If you want to include the label in the normalization, there is no operator for this "de-normalization", but you could set up a feature generation to transform the predicted value back to the original range. You can obtain the necessary parameters from the visualization of the normalization model. Here is a complete example:
Cheers,
Ingo
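[Editor's note: the process example Ingo refers to did not survive in this copy of the thread. As a sketch only, the feature-generation back-transform he describes can be written out for the two common Normalize settings, assuming you have read the parameters (min/max, or mean/standard deviation) from the normalization model's visualization:]

```python
def denormalize_range(y_norm, orig_min, orig_max):
    """Invert a range transformation (min-max normalization to [0, 1])."""
    return y_norm * (orig_max - orig_min) + orig_min

def denormalize_zscore(y_norm, mean, std):
    """Invert a Z-transformation (standardization)."""
    return y_norm * std + mean

# Example: a prediction of 0.5 on a label that originally ranged 10..30
print(denormalize_range(0.5, 10.0, 30.0))  # 20.0
```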
Dear Ingo
Would it be OK to apply normalization only to the regular (predictor) attributes and not to the label (prediction)?
Regards
Ben
Hello @cyborghijacker -
Just FYI about the forum...if you want to call someone's attention, use the @ handle with the username. Very effective.
Scott
My wife considers me 'de-normalized!' yuk yuk yuk!
Why doesn't the De-Normalize operator work for you in this case? You just need to feed the original Normalize preprocessing model into it; it returns the inverted model, which can then be applied to the normalized data, and you should get the de-normalized data back out.
Lindon Ventures
Data Science Consulting from Certified RapidMiner Experts
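[Editor's note: the Normalize-model / De-Normalize pairing described above behaves like a fitted transform and its inverse. A minimal pure-Python analogue, not RapidMiner's implementation, assuming min-max normalization:]

```python
class MinMaxModel:
    """Analogue of a fitted Normalize (range transformation) model."""

    def __init__(self, data):
        self.min = min(data)
        self.max = max(data)

    def transform(self, data):
        """Apply the normalization, as Apply Model does with the model."""
        span = self.max - self.min
        return [(x - self.min) / span for x in data]

    def inverse(self, data):
        """Analogue of applying the model returned by De-Normalize."""
        return [x * (self.max - self.min) + self.min for x in data]

model = MinMaxModel([10.0, 20.0, 30.0])
normalized = model.transform([10.0, 20.0, 30.0])  # [0.0, 0.5, 1.0]
restored = model.inverse(normalized)              # [10.0, 20.0, 30.0]
```

The round trip recovers the original values exactly, which is why feeding the original preprocessing model into De-Normalize is enough.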
I see, but I am still somewhat confused. Consider the process below for training a model and then applying it to new data. How should I connect the operators so that the new data is normalized in the same way, using the normalization model obtained from the training data, the Deep Learning model is then applied to it to generate the prediction, and finally a de-normalize operator brings the prediction back to its initial (unnormalized) values? I hope my process is sound, though.
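[Editor's note: the wiring asked about here reduces to one order of operations: fit the normalization on the training data only, reuse that same model when scoring, and invert it on the predictions. A schematic pure-Python sketch, with a trivial stand-in for the learner (all names hypothetical):]

```python
def fit_minmax(values):
    """Fit a min-max normalization model on TRAINING data only."""
    return {"min": min(values), "max": max(values)}

def normalize(values, m):
    span = m["max"] - m["min"]
    return [(v - m["min"]) / span for v in values]

def denormalize(values, m):
    return [v * (m["max"] - m["min"]) + m["min"] for v in values]

# 1. Fit the normalization model on the training label.
train_label = [100.0, 200.0, 300.0]
norm_model = fit_minmax(train_label)

# 2. Train the learner on normalized data (stand-in: identity model).
predict = lambda xs: xs

# 3. Score new data: apply the SAME normalization model, run the
#    learner, then de-normalize the predictions.
new_norm = normalize([150.0, 250.0], norm_model)
preds_norm = predict(new_norm)
preds = denormalize(preds_norm, norm_model)  # back in original units
```

The key point is step 3: the new data must go through the model fitted in step 1, never through a freshly fitted normalization.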
@sgenzer Thank you. I just used it!
Maybe this thread helps: http://community.rapidminer.com/t5/RapidMiner-Studio-Forum/Reverse-map-a-nominal-to-numerical-transform/m-p/39662