Model validation performance

Muhammed_Fatih_ Member Posts: 93 Maven
Hello together, 

which validation approach for classification models is quicker with regard to the learning and testing phase: cross-validation or the classical split validation (with a 70:30 split)? 

Thank you in advance for your help! 

Best regards, 

Fatih

Best Answers

  • varunm1 Moderator, Member Posts: 1,207 Unicorn
    edited November 2019 Solution Accepted
    Split validation is quicker: it builds the model only once and then tests it on the hold-out set. In the case of cross-validation, the model is built k+1 times (k being the number of folds, plus one final model on the full data).

    I haven't encountered any special case where cross-validation performed faster than split validation, and I don't think it happens if all other settings are the same (feature selection, hyperparameters, etc.).

    Maybe if you use a processor with multiple cores and the cross-validation folds are run in parallel, there might be a chance depending on the fold sizes. But in general, the above holds.
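The cost difference described above can be sketched outside RapidMiner as well. This is a minimal Python/scikit-learn stand-in (not the RapidMiner operators themselves): a single 70:30 split trains one model, while 10-fold cross-validation trains ten, so the CV estimate costs roughly ten times as much wall-clock time for the same learner.

```python
# Illustrative timing sketch: one train/test split vs. 10-fold CV.
# Uses scikit-learn as a stand-in for RapidMiner's Split Validation /
# Cross Validation operators; dataset and learner are arbitrary choices.
import time

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Classical 70:30 split validation: one fit, one evaluation.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.7, random_state=0)
t0 = time.perf_counter()
acc_split = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr).score(X_te, y_te)
t_split = time.perf_counter() - t0

# 10-fold cross-validation: ten fits, ten evaluations.
t0 = time.perf_counter()
acc_cv = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=10).mean()
t_cv = time.perf_counter() - t0

print(f"70:30 split : accuracy {acc_split:.3f} in {t_split:.3f}s")
print(f"10-fold CV  : accuracy {acc_cv:.3f} in {t_cv:.3f}s")
```

Run sequentially, the cross-validation timing scales with the number of folds; parallel fold execution (the multi-core point above) shrinks that gap.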

    Hope this helps.
    Regards,
    Varun
    https://www.varunmandalapu.com/

    Be Safe. Follow precautions and Maintain Social Distancing

  • mschmitz Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,231 RM Data Scientist
    Solution Accepted
    Hi,
    Cross-validation is the more accurate estimator of the true model performance.
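One way to see this claim concretely: a single split's performance estimate depends heavily on which 30% of rows happen to land in the test set, while a k-fold average uses every row for testing once. The sketch below (Python/scikit-learn as an illustration, not RapidMiner itself) repeats both schemes with different random seeds and compares the spread of the resulting estimates.

```python
# Sketch: variance of a single-split estimate vs. a 10-fold CV estimate.
# Repeats each scheme over several seeds; the CV mean typically varies less.
import statistics

from sklearn.datasets import make_classification
from sklearn.model_selection import KFold, cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

split_scores, cv_scores = [], []
for seed in range(20):
    # One 70:30 split per seed: estimate from a single hold-out set.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, train_size=0.7, random_state=seed
    )
    split_scores.append(
        DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr).score(X_te, y_te)
    )
    # One 10-fold CV per seed: estimate averaged over ten hold-out folds.
    cv = KFold(n_splits=10, shuffle=True, random_state=seed)
    cv_scores.append(
        cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=cv).mean()
    )

print("std of single-split estimates:", statistics.stdev(split_scores))
print("std of 10-fold CV estimates :", statistics.stdev(cv_scores))
```

The lower spread of the cross-validated estimates is what makes it the more reliable estimator, at the price of the extra training runs discussed above.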

    Best,
    Martin
    - Head of Data Science Services at RapidMiner -
    Dortmund, Germany

Answers

  • Muhammed_Fatih_ Member Posts: 93 Maven
    Hi varunm, 

    thank you for your answer! An additional question: can one say that either of the two validation processes (split validation vs. cross-validation) performs better in general with regard to learning and testing? 

    Best regards! 