LibSVM

farquad Member Posts: 1 Contributor I
edited November 2018 in Help
When I use different kernel functions (linear, polynomial, RBF, and sigmoid), different sets of support vectors are extracted. Why, and how does this happen?

Likewise, changing a kernel's parameters produces a different set of support vectors. Why?

Could anyone please explain this to me? I would be thankful.

Thank you very much.

Answers

  • IngoRMIngoRM Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, Community Manager, RMResearcher, Member, University Professor Posts: 1,750  RM Founder
    Hi,

    phew, explaining this fully would be equivalent to explaining the whole idea of kernel-based learning. Instead of replicating everything that has been said and written at least a hundred times, I will give you some hints:

    - let's say the data cannot be separated by a linear hyperplane in its input space, but can be by a polynomial one. Then the linear hyperplane will cause many errors, and hence many support vectors, since every misclassified or margin-violating point becomes a support vector
    - the same applies to the other kernel functions, and to inappropriate kernel parameters: the worse the kernel fits the data's structure, the more support vectors are needed

    If you have any difficulties understanding those two hints, I would suggest learning more about how support vector machines work, e.g. at http://www.kernel-machines.org
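    You can see this effect directly in a few lines of code. The sketch below uses scikit-learn's SVC (which, like RapidMiner's operator, wraps LibSVM under the hood) rather than RapidMiner itself; the dataset and parameter values are just illustrative assumptions. On data that is not linearly separable, each kernel ends up with a different number of support vectors:

    ```python
    # Illustrative sketch: count support vectors per kernel on non-linear data.
    # scikit-learn's SVC wraps LibSVM, the same library the thread is about.
    from sklearn.datasets import make_moons
    from sklearn.svm import SVC

    # Two interleaving half-moons: no linear hyperplane separates them cleanly.
    X, y = make_moons(n_samples=200, noise=0.15, random_state=42)

    for kernel in ("linear", "poly", "rbf", "sigmoid"):
        clf = SVC(kernel=kernel, C=1.0, gamma="scale").fit(X, y)
        # Points misclassified or inside the margin become support vectors,
        # so a kernel that fits the data's shape poorly needs more of them.
        print(kernel, len(clf.support_))
    ```

    Re-running with a different C or gamma changes the margin, and therefore which points end up as support vectors, which is exactly the second effect you observed.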

    Cheers,
    Ingo
