Hi, unfortunately this isn't currently possible with a built-in operator. If you don't have too many attributes, you could do the calculation with the Generate Attributes operator. Why do you want to normalize each complete example this way? The attributes would receive different weights per example, and learning would become nearly impossible.

Greetings,
Sebastian
Actually, depending on the learning task, the L2 norm can be quite effective. For example, say you have a long feature vector and want to detect when only a few of many attributes are abnormally different compared to the population trend. With dot-product kernels, L2 normalization lets you compute the angle between samples quickly (the dot product between normalized vectors equals cos(theta)). There is also the added benefit of working with small feature magnitudes, which speeds up computation in LibSVM, for instance. So that's why I'm using it. There are other use cases as well, and I think it would be a great feature to add to the normalization operator.
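A minimal sketch in plain Python of what row-wise L2 normalization does (the `l2_normalize` helper is illustrative, not a RapidMiner or LibSVM API):

```python
import math

def l2_normalize(row):
    # Scale a feature vector to unit L2 norm; a zero vector is returned unchanged.
    norm = math.sqrt(sum(x * x for x in row))
    return [x / norm for x in row] if norm > 0 else list(row)

a = l2_normalize([3.0, 4.0])  # -> [0.6, 0.8]
b = l2_normalize([4.0, 3.0])  # -> [0.8, 0.6]

# With unit-length vectors, the plain dot product is exactly cos(theta)
# between the two samples -- no extra division by the norms is needed.
cos_theta = sum(x * y for x, y in zip(a, b))  # 0.6*0.8 + 0.8*0.6 = 0.96
```

Note that this rescales every attribute of a row by a row-dependent factor, which is exactly the concern raised above for general learning tasks; it is mainly useful when the direction of the vector, not its magnitude, carries the signal.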
-Gagi
Sebastian Land wrote:
Hi, I thought about this yesterday evening while driving home and came to exactly the same conclusion. I've added it to the growing stack of good ideas that are still to be implemented...

Greetings,
Sebastian