# Logistic regression (kernel type: dot) weights question

bingojosjtu
Hi, I have another question.

I successfully generated a model and weights using logistic regression (dot kernel).

Later, I wanted to check whether those weights make sense and can be applied elsewhere, so I started calculating by hand to see if I could match the output value (i.e., the probability).

I tried to use the equation P = 1/(1 + exp(-(k_1*x_1 + k_2*x_2 + ... + k_n*x_n + b))) to back-calculate a known answer, but somehow the value is off by a lot.
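For reference, this is the hand calculation I'm doing. A minimal sketch, with placeholder weights and inputs (substitute the values your model actually reports):

```python
import math

# Hypothetical weights and intercept from a trained model
# (placeholders -- substitute the values your model actually reports).
weights = [0.8, -1.2, 0.5]   # k_1 ... k_n
bias = 0.3                   # b
x = [1.5, 0.7, 2.0]          # one example's attribute values

# Linear score: k_1*x_1 + ... + k_n*x_n + b
score = sum(k * xi for k, xi in zip(weights, x)) + bias

# Logistic link: P = 1 / (1 + exp(-score))
p = 1.0 / (1.0 + math.exp(-score))
print(score, p)
```

If the model was trained on rescaled data, the raw x values here would have to be rescaled the same way before this calculation can match the model's output.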

My question: did I do anything wrong? For example, do I need to rescale my input data in some other way? Is there any documentation I can consult, other than the "Help" section?

Thank you in advance for the help!

P.S. After looking further into the weights, I wonder why the magnitudes of the weights for large-valued attributes are similar to those for small-valued attributes.

Is it possible that the weights are actually adjusted? Is there a way I can obtain the actual weights k_1, k_2, k_3, ...?
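One common reason weights for large- and small-valued attributes look similar in magnitude is internal standardization: if the learner z-scores each attribute before fitting (an assumption here; check your tool's documentation), the reported weights apply to standardized inputs. A sketch of converting such weights back to raw-scale coefficients, where `mu` and `sigma` are hypothetical per-attribute means and standard deviations of the training data:

```python
# Hypothetical standardized weights and training-data statistics
# (placeholders -- substitute your own values).
w_std = [0.9, -0.4]     # weights reported for z-scored attributes
b_std = 0.1             # intercept reported by the model
mu = [100.0, 0.02]      # per-attribute training means
sigma = [15.0, 0.005]   # per-attribute training standard deviations

# If z_i = (x_i - mu_i) / sigma_i, then
#   sum(w_std_i * z_i) + b_std == sum(w_raw_i * x_i) + b_raw
# with:
w_raw = [w / s for w, s in zip(w_std, sigma)]
b_raw = b_std - sum(w * m / s for w, m, s in zip(w_std, mu, sigma))

# Sanity check: both parameterizations give the same score for one example.
x = [110.0, 0.015]
z = [(xi - m) / s for xi, m, s in zip(x, mu, sigma)]
score_std = sum(w * zi for w, zi in zip(w_std, z)) + b_std
score_raw = sum(w * xi for w, xi in zip(w_raw, x)) + b_raw
print(score_std, score_raw)  # the two scores should match
```

If this is what your tool does, the raw-scale `w_raw` would be the actual k_1, k_2, ... you're looking for.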
