
KNN prediction performance

mehrdad Member Posts: 3 Contributor I
edited November 2018 in Help

Hello All,

 

I am new here and I need your help. I have a training dataset and a test dataset to be classified via kNN classification: Training ----> k-NN ---> Apply Model, and Test ---> Apply Model. I do not want to use cross-validation or split validation, but can someone tell me how to measure the performance of k-NN in predicting the classes of my test dataset? I don't know how to connect the output of Apply Model as input to the Performance (Classification) operator to see how good the prediction of my classification is.

 

Tnx


Answers

  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn

    Hi,

    RapidMiner contains some very useful tutorials and explanations for beginners. They are a bit like the tutorials you have in games nowadays to explain what to do. I would recommend checking them out: click New Process, select "Learn", and then start from the beginning to understand how ports, parameters, colors, etc. work.

    Chapter 3, Model, Scoring and Validation, is an exact match for your question.

     

    Greetings

     Sebastian

  • mehrdad Member Posts: 3 Contributor I

    Thanks Sebastian,

     

    I've checked some of them, and I got results when using k-NN as a subprocess of cross validation or split validation, but I have no idea how to evaluate the prediction score for the data shown in the attached pic.

     

    Anyway, thanks for your follow-up.

  • MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,531 RM Data Scientist

    Hi,

     

    the per (performance) port of Cross Validation already delivers the performance.

     

    Otherwise, you can simply connect the lab (labelled data) port of Apply Model (2) to the lab port of Performance (2) and look at the result.

     

    ~Martin

    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
  • mehrdad Member Posts: 3 Contributor I

    Hi,

     

    Thanks for your reply. You're right that Cross Validation already gives us the classification performance, but that is not the performance of predicting the test set, which is directly connected to Apply Model (2). I want to know how good my prediction is with respect to my classification approach. I tried to connect the output of Apply Model (2) directly to Performance, but I got a couple of errors (it says you don't have a label, or you need to set a criterion, but I don't know how!).

    If you want to know how your classification performance relates to predicting a new test dataset (Retrieve 08), how would you measure it?

    Thanks in advance

  • MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,531 RM Data Scientist

    Ahhh, in order to apply Performance, the incoming data set needs an attribute with the label role. So you simply need to use a Set Role operator and set the role of your label variable to label.

     

    ~Martin

    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
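
For readers who want to reproduce this train / apply / evaluate flow outside RapidMiner, here is a minimal sketch in Python with scikit-learn. It is only an analog of the process discussed above, not RapidMiner's own API; the iris data and k = 5 are illustrative assumptions. Passing the true test labels to the scorer plays the same role as Set Role marking the label column before Performance.

```python
# Hypothetical scikit-learn analog of the RapidMiner process in this thread:
# train k-NN on a training set, apply the model to a held-out test set,
# then evaluate predictions against the test set's label column.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Illustrative data; in the thread this would be the two Retrieve operators.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

model = KNeighborsClassifier(n_neighbors=5)  # the "k-NN" operator (k assumed)
model.fit(X_train, y_train)                  # training on the training set

y_pred = model.predict(X_test)               # "Apply Model (2)" on the test set

# "Performance (2)": compare predictions with the true labels of the test set.
print(f"test accuracy: {accuracy_score(y_test, y_pred):.3f}")
```

The key point mirrors Martin's answer: the scorer needs the true labels (`y_test`) alongside the predictions, just as Performance needs a column with the label role.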