
"SVM question"

evgenyevgeny Member Posts: 11 Contributor II
edited May 2019 in Help
Given a binary label (e.g. Y/N) with some attributes, I train an SVM model to produce predictions and corresponding confidence percentages for, say, Y.

Question: is there a way to augment the model (iteratively?) such that the resulting confidence percentages are, say, at least x%?

I understand that this may not be possible for all the data points; my question is rather whether there is a way of feeding this additional criterion directly into the optimisation.

Answers

  • landland RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531   Unicorn
    Hi,
    that's exactly what the optimization does anyway. There's no need to change the optimization function.

    If you want to vary the parameter settings of the learner, you could use a performance measure that uses the resulting confidences to estimate the performance, and then use a parameter optimization operator.

    Greetings,
      Sebastian
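A minimal sketch of this idea, written in Python/scikit-learn rather than RapidMiner operators (the operator names differ, but the approach is the same): score each parameter combination by the confidence the fitted SVM assigns to the true class, then keep the combination that maximizes it. The dataset and parameter grid are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

def confidence_score(estimator, X, y):
    """Mean predicted probability assigned to each example's true class."""
    proba = estimator.predict_proba(X)
    # Map each true label to its column in predict_proba's output.
    cols = np.searchsorted(estimator.classes_, y)
    return float(np.mean(proba[np.arange(len(y)), cols]))

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
search = GridSearchCV(
    SVC(probability=True, random_state=0),  # probability=True enables confidences
    param_grid={"C": [0.1, 1.0, 10.0], "gamma": ["scale", 0.1]},
    scoring=confidence_score,  # GridSearchCV accepts a callable(estimator, X, y)
    cv=3,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

Note that this tunes the model toward being confident overall; it cannot force any individual example's confidence above a given value.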
  • evgenyevgeny Member Posts: 11 Contributor II
    Sebastian, thank you for your reply.

    If I understood what you propose correctly, I am not sure that it addresses my question. To illustrate my problem:

    id         Class   confidence(Y)   MinConfidence   confidence(Y)*
    reading1   N       23.78%          20.00%          x1%
    reading2   Y       42.21%          30.00%          x2%
    reading3   Y       36.61%          40.00%          x3%

    For example, reading2 and reading3 are of class Y. The initial model I've trained gives them confidences of 42.21% and 36.61% respectively.

    Is there a way to construct a new model, whereby I would feed in an additional variable MinConfidence, such that the newly trained model produces Confidence(Y)* values equal to or greater than the corresponding MinConfidence values?

    If so, would you be able to illustrate with some code?

    Thank you very much for your help.

  • landland RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531   Unicorn
    Hi,
    sorry, but I don't believe you have understood what the confidence is about. The confidence reflects how sure the learner is that the example is of that class. You cannot change this confidence without changing the complete classification function.
    But if you already know what the minimal confidence should be, then why not simply set it as the confidence threshold and set the label depending on it?
    Sorry for asking, but do you really know what you want to do?

    Greetings,
      Sebastian
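A minimal sketch of the thresholding Sebastian describes, keeping the trained model's confidences unchanged and applying the known per-example minimum confidence as a decision rule afterwards. The values come from the table above; the function name is illustrative, not from the thread.

```python
def apply_min_confidence(confidence_y, min_confidence):
    """Assign class 'Y' only where confidence(Y) meets the required minimum;
    otherwise fall back to 'N'."""
    return ["Y" if c >= m else "N" for c, m in zip(confidence_y, min_confidence)]

# confidence(Y) and MinConfidence for reading1..reading3 from the table:
labels = apply_min_confidence([0.2378, 0.4221, 0.3661], [0.20, 0.30, 0.40])
print(labels)  # ['Y', 'Y', 'N']
```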
  • evgenyevgeny Member Posts: 11 Contributor II
    You are quite right, I am new to the subject and am still finding my way...

    Thank you for your time; let me think things through properly before taking any more of it.