
Explain Prediction inside Cross Validation : Error

varunm1 Moderator, Member Posts: 999 Unicorn
edited August 2019 in Help
Hello,

I am working on a process that needs the Explain Predictions operator inside the testing subprocess of a Cross Validation operator. I use Forward Selection inside the training subprocess to pick the relevant features that support the predictions, but when I connect those to the "tra" port of Explain Predictions, and the "tes" port of Cross Validation to the "tes" port of Explain Predictions, it throws an error.

I would like to take the attributes chosen by Forward Selection and apply the same attribute set to the testing port of Explain Predictions so that it no longer throws the error. Currently, all of the features go into the testing port of Explain Predictions, but not into the training port, since training is done only on the selected attributes.

The process works fine without Explain Predictions. I have attached the data as .ioo files, which can be placed directly in the repository, along with the process, in this thread.

I built the same process using the Automatic Feature Engineering operator and didn't face any issue. My understanding is that Automatic Feature Engineering outputs a feature set rather than a feature-selected example set like Forward Selection does; there I simply used the Apply Feature Set operator so that the train and test data going into Explain Predictions have the same attributes.
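For readers outside RapidMiner, the underlying problem can be sketched in a few lines of plain Python (this is not RapidMiner code; the attribute names are illustrative): the training fold is reduced to the attributes chosen by forward selection, so the test fold must be reduced to exactly the same attribute set before both are handed to an explanation step.

```python
# Align a test example with the attribute set selected on the training fold.
# Attribute names and values here are made up for illustration.

def align_test_to_train(selected_attributes, test_row):
    """Keep only the test attributes that forward selection kept on training."""
    return {att: test_row[att] for att in selected_attributes}

selected = ["att1", "att3"]  # attributes kept by forward selection
test_row = {"att1": 1.0, "att2": 5.0, "att3": 2.0, "att4": 7.0}

aligned = align_test_to_train(selected, test_row)
# aligned contains exactly the training attributes: {"att1": 1.0, "att3": 2.0}
```

The error described above arises precisely because this alignment step is missing: the test data still carries att2 and att4, which the trained model never saw.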



@IngoRM

Thanks for your suggestions.

Best Answer

  • IngoRM Posts: 1,715 RM Founder
    Solution Accepted
    Hi,
    Another idea: use the attribute "weights" produced by the feature selection and deliver them to the through port.  In the testing part you could then use the "Select by Weights" operator to replicate the same attribute set.
    Hope this helps,
    Ingo
    RapidMiner Wisdom 2020
    February 11th and 12th 2020 in Boston, MA, USA
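Ingo's weight-based approach can be sketched like this (again plain Python for illustration, not RapidMiner code; the names are made up): the feature selection yields a weight per attribute, typically 1 for kept and 0 for dropped, and the test fold keeps only the attributes whose weight exceeds a threshold.

```python
# Sketch of the "Select by Weights" idea: reduce a test example to the
# attributes whose selection weight is above a threshold.

def select_by_weights(weights, row, threshold=0.0):
    """Keep the attributes of `row` whose weight is strictly above `threshold`."""
    return {att: row[att] for att, w in weights.items() if w > threshold}

weights = {"att1": 1.0, "att2": 0.0, "att3": 1.0, "att4": 0.0}
row = {"att1": 1.0, "att2": 5.0, "att3": 2.0, "att4": 7.0}

select_by_weights(weights, row)  # keeps only att1 and att3
```

Passing the weights through the "thru" port plays the role of the `weights` dictionary here: it carries the training-side selection decision into the testing subprocess.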

Answers

  • Telcontar120 Moderator, RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 1,293 Unicorn
    Did you try a Remember/Recall combination on the attributes that need to be passed through from the train to the test set?  That could work.
    Brian T.
    Lindon Ventures 
    Data Science Consulting from Certified RapidMiner Experts
  • varunm1 Moderator, Member Posts: 999 Unicorn
    Hello Brian,

    I am passing the training data, exactly as selected by the feature selection, to Explain Predictions through the "thru" port of Cross Validation. My only issue is with the testing data. The testing data in Cross Validation comes from the whole dataset, so it contains all attributes, and based on my understanding that is what makes Explain Predictions throw the error. I am just looking for a way to filter the attributes in the test data based on the attributes in the training data.

    I will see what I can do with Remember/Recall.
  • varunm1 Moderator, Member Posts: 999 Unicorn
    Thanks @IngoRM 

    I have one question. When I pass the dataset with all attributes to Apply Model, it doesn't throw an error, but Explain Predictions throws the error described earlier. Does Apply Model automatically filter out attributes that were not used in model building, while Explain Predictions is unable to do so?

    Thanks for your suggestions.
  • IngoRM Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, Community Manager, RMResearcher, Member, University Professor Posts: 1,715 RM Founder
    The pre-flight checks for Explain Predictions are indeed a little stricter than those for Apply Model.  In the next version we have already made the type checks a bit less strict, but I will also look into the restrictions for supersets...

  • varunm1 Moderator, Member Posts: 999 Unicorn
    Thanks @IngoRM, that answers my question.