
Similar parameters of JMySVMLearner and LibSVMLearner?

Legacy User Member Posts: 0 Newbie
edited November 2018 in Help
Hello together,

I have a regression problem where I have a kind of "pseudo-extrapolation" of the data. With Matlab and newrb I reach pretty good results. The same holds for some other publicly available SVMs coded in Matlab. Also Treeboosts of DTREG produce pretty good results.

Furthermore, RapidMiner's LibSVMLearner produces pretty good forecasts with the simple process "ExampleSource->LibSVMLearner->ExampleSource->ModelApplier" for the following parameter set:

svm_type=nu-SVR
kernel-type=rbf
degree=3
gamma=1.0
coef0=0
C=1500.00
nu=0.5
cache_size=80
epsilon=0.0010
p=0.1
shrinking=yes
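As a side note, this parameter set can be reproduced outside RapidMiner as well. The sketch below is an assumption, not part of the original post: it uses scikit-learn's NuSVR (which wraps the same LibSVM library) on a small synthetic dataset standing in for the real data. LibSVM's epsilon=0.0010 is the termination tolerance (mapped to tol here), and p=0.1 is unused by nu-SVR.

```python
# Sketch (assumption): the LibSVM nu-SVR settings above, expressed via
# scikit-learn's NuSVR, which wraps the same LibSVM library.
import numpy as np
from sklearn.svm import NuSVR

# Tiny synthetic regression problem, a stand-in for the real data.
rng = np.random.RandomState(0)
X = rng.uniform(-1, 1, size=(40, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.randn(40)

model = NuSVR(
    nu=0.5,          # nu=0.5
    C=1500.0,        # C=1500.00
    kernel="rbf",    # kernel-type=rbf
    gamma=1.0,       # gamma=1.0
    degree=3,        # degree=3 (ignored by the rbf kernel)
    coef0=0.0,       # coef0=0 (ignored by the rbf kernel)
    tol=0.001,       # epsilon=0.0010 is LibSVM's termination tolerance (-e)
    cache_size=80,   # cache_size=80 (MB)
    shrinking=True,  # shrinking=yes
)
model.fit(X, y)
preds = model.predict(X)  # one forecast per example
print(preds.shape)
```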


When I change to JMySVMLearner I never get good results, even when "playing" with the parameters. Admittedly, I don't know the exact meaning of the parameters. Does anybody have a good hint for me as to which parameter combinations of JMySVMLearner will probably produce good results? Are there some kind of "transformation rules" from LibSVMLearner to JMySVMLearner?

Thank you all,

Sascha

Answers

    land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi Sascha,
    a good parameter combination depends on the problem at hand. Also, some parameters may be meaningless depending on the settings of the other parameters: if you choose a radial kernel, for example, the degree parameter does not apply. If you point your mouse at a parameter name, a hint is shown explaining when that parameter is applicable.
    I don't know whether the two implementations use the same mathematical algorithm to optimize the SVM objective. The parameters probably cannot be translated 1:1; if I remember correctly, some are even inverted in their interpretation. But if rbf worked fine with LibSVM, the radial basis function (rbf) kernel of JMySVM should work best, too. Vary one parameter at a time and try to get a feeling for their meanings.

    Greetings,
      Sebastian
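The "vary one parameter at a time" advice above can be sketched as a simple sweep. This is an illustrative assumption, not from the thread: it again uses scikit-learn's NuSVR as a stand-in for the RapidMiner learners, sweeping the RBF gamma while all other settings stay fixed and scoring each candidate by cross-validation.

```python
# Sketch (assumption): one-parameter-at-a-time tuning, here over the RBF
# gamma, with every other setting held fixed.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import NuSVR

# Synthetic regression data, a stand-in for the real problem.
rng = np.random.RandomState(0)
X = rng.uniform(-1, 1, size=(60, 2))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.randn(60)

results = {}
for gamma in [0.01, 0.1, 1.0, 10.0]:
    model = NuSVR(kernel="rbf", nu=0.5, C=1500.0, gamma=gamma, tol=0.001)
    # Mean 5-fold cross-validated R^2 for this gamma value.
    results[gamma] = cross_val_score(model, X, y, cv=5).mean()

best_gamma = max(results, key=results.get)
print(results)
print(best_gamma)
```

Once the best value for one parameter is found, fix it and repeat the sweep for the next parameter (C, nu, ...) to build an intuition for each one's effect.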