
classification question

boudi313 Member Posts: 9 Contributor II
edited November 2018 in Help
hi all

I want to ask a question about classification algorithms in RapidMiner.

The question is: can I put 2 classification operators one after the other, e.g. k-NN -> NB or logistic regression -> NB? Is that possible in RapidMiner? If yes, I would like an explanation of how to do it; if not, I want to know how to do this another way. This is important to me because I need it for my master's thesis on opinion mining.

please answer me

thank you

Answers

  • MariusHelf RapidMiner Certified Expert, Member Posts: 1,869 Unicorn
    Hi,

    first of all, let me state that modern keyboards with "dot" and "comma" keys are available at your local computer store. Using such a keyboard can greatly improve the readability of your texts ;)

    Concerning your question: I don't fully understand what you mean by "put 2 classification operators one follow each other". Can you be a bit more specific?

    All the best,
    Marius
  • boudi313 Member Posts: 9 Contributor II
    I mean that I want to use 2 classification operators instead of 1 for better classification accuracy, e.g. use the k-NN operator and then put the NB operator after it.

    do you understand now, or not?
  • fritmore Member Posts: 90 Contributor II
    Hi

    if you mean putting them back to back, i.e. in series, then why not. You can do it in RM.

    k-NN will output predicted labels, and a subsequent learner (e.g. NB) can take those predictions as input and build a new model on top of them; that is the idea behind stacking, as in the sketch below.
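    In code terms (outside RapidMiner) this is called stacking. Here is a minimal sketch in Python with scikit-learn; the Iris data, the neighbour count, and the train/test split are illustrative assumptions, not anything from your process:

      # Stacking sketch: k-NN as the base learner, Naive Bayes as the
      # combining (meta) learner. Dataset and parameters are illustrative.
      from sklearn.datasets import load_iris
      from sklearn.ensemble import StackingClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.naive_bayes import GaussianNB
      from sklearn.neighbors import KNeighborsClassifier

      X, y = load_iris(return_X_y=True)
      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

      # NB is trained on k-NN's cross-validated predictions,
      # i.e. the k-NN output feeds into NB.
      stack = StackingClassifier(
          estimators=[("knn", KNeighborsClassifier(n_neighbors=5))],
          final_estimator=GaussianNB(),
      )
      stack.fit(X_train, y_train)
      print("stacked accuracy:", stack.score(X_test, y_test))

    Inside RapidMiner the closest equivalent is the Stacking meta-operator: put k-NN in as a base model and NB as the stacking model.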

  • boudi313 Member Posts: 9 Contributor II
    ok, I have a second question: is the logistic regression in RapidMiner the same as maximum entropy, or not?