How to use an output of one model as an input to another one?

Matose Member Posts: 3 Newbie
edited December 2020 in Help
So a noob question here...

I have a dataset with 10 attributes and two labels, say Label X and Label Y. I built an individual linear regression model for each label, and performance was okay, but it could be better. I noticed that the two labels, X and Y, are highly correlated, and if I use Label X as a feature when modeling Label Y, the model's accuracy improves considerably. But I cannot use Label X as an input feature in deployment, since I don't know its value yet. And simply predicting Label X with one model and feeding that prediction into another model to predict Label Y would not account for error propagation, so the estimated performance may be too optimistic.

So I am trying to find a way to build a sort of parent model that uses the 10 features to first predict Label X, and then uses that result, together with the 10 features again, to predict Label Y. I read that stacking requires the base learners and the stacking learner to have the same label, so I cannot use that. Is there any other way to do this?
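For what it's worth, the chained setup described above can be sketched outside RapidMiner in a few lines of plain numpy (everything here is illustrative: the synthetic data, the helper names `fit`/`predict`, and the use of ordinary least squares are all assumptions, not anything from the thread). Stage 1 fits Label X on the 10 features; stage 2 fits Label Y on the features plus the *predicted* X, so deployment never needs the true X:

```python
# Illustrative numpy-only sketch of a two-stage "chained" regression.
# All names and the synthetic data are made up for this example.
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 10
features = rng.normal(size=(n, p))
x = features @ rng.normal(size=p) + rng.normal(scale=0.1, size=n)  # Label X
y = 2.0 * x + rng.normal(scale=0.1, size=n)                        # Label Y, correlated with X

def fit(A, b):
    """Ordinary least squares with an intercept column."""
    A1 = np.column_stack([np.ones(len(A)), A])
    coef, *_ = np.linalg.lstsq(A1, b, rcond=None)
    return coef

def predict(A, coef):
    return np.column_stack([np.ones(len(A)), A]) @ coef

# Stage 1: predict X from the 10 features.
coef_x = fit(features, x)
x_hat = predict(features, coef_x)

# Stage 2: predict Y from the features plus the *predicted* X
# (the predicted value, not the true one, so training matches deployment).
coef_y = fit(np.column_stack([features, x_hat]), y)
y_hat = predict(np.column_stack([features, x_hat]), coef_y)

print(round(float(np.corrcoef(y, y_hat)[0, 1]), 3))
```

Note that this sketch fits stage 2 on in-sample predictions of X, which is exactly the optimism the question worries about; a more honest version would feed stage 2 out-of-fold (cross-validated) predictions of X instead.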

Thank You for taking the time to help. 


  • jacobcybulski Member, University Professor Posts: 388 Unicorn
    If you have two highly correlated numerical labels, then most likely if one can be predicted from the remaining attributes, so can the other, and the two predictions will still be highly correlated. You can set X as the "label" and give Y the role "hidden" (type it in), run the model to predict X, then change the role of X to "result1" and Y to "label", and run the second model. Now you have both predictions in a single row, but neither prediction was used as an input to the other. It is generally not a good idea to pipe one prediction in as a predictor for another model, as the errors would accumulate.
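The role-swapping workflow above amounts to predicting each label independently from the same features and placing both predictions side by side per row. A minimal numpy sketch of that idea (again illustrative, not RapidMiner operators; the data and the `ols_predict` helper are assumptions):

```python
# Illustrative sketch: predict Label X and Label Y independently from the
# same 10 features, then put both predictions side by side per example.
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 10
features = rng.normal(size=(n, p))
x = features @ rng.normal(size=p) + rng.normal(scale=0.1, size=n)  # Label X
y = 2.0 * x + rng.normal(scale=0.1, size=n)                        # Label Y (correlated)

def ols_predict(A, b):
    """Fit OLS with an intercept on (A, b) and return in-sample predictions."""
    A1 = np.column_stack([np.ones(len(A)), A])
    coef, *_ = np.linalg.lstsq(A1, b, rcond=None)
    return A1 @ coef

# Neither prediction feeds the other, so no errors are piped between models.
preds = np.column_stack([ols_predict(features, x), ols_predict(features, y)])
print(preds.shape)  # (200, 2): predicted X and predicted Y per row
```

Because X and Y are themselves correlated, the two independently produced prediction columns end up highly correlated as well, which is the point of the answer: you get both values per row without stacking one model on the other.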