
## Answers

It's currently not included, but we are planning to implement it in the near future. The problem is that the optimization is not as easy as for, say, ridge regression or simple linear regression: it does not collapse to a closed form and hence needs a more complex iterative optimization. If you have a suggestion or an implementation at hand, feel free to contribute.
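To make the difference concrete: ridge regression has the closed-form solution beta = (X'X + lambda*I)^(-1) X'y, while the L1 penalty of the LASSO is non-differentiable at zero, so solvers have to iterate. Below is a minimal sketch of one common iterative scheme, coordinate descent with soft-thresholding; it is illustrative only, not the implementation planned for RapidMiner, and it assumes standardized predictor columns and a centered target.

```java
// Minimal LASSO sketch via coordinate descent with soft-thresholding.
// Illustrative only; assumes columns of x are standardized (zero mean,
// unit variance) and y is centered.
public class LassoSketch {

    static double[] fit(double[][] x, double[] y, double lambda, int sweeps) {
        int n = x.length, p = x[0].length;
        double[] beta = new double[p];
        for (int s = 0; s < sweeps; s++) {
            for (int j = 0; j < p; j++) {
                // rho = (1/n) * x_j' * (residual computed without feature j)
                double rho = 0.0;
                for (int i = 0; i < n; i++) {
                    double partial = 0.0;
                    for (int k = 0; k < p; k++) {
                        if (k != j) partial += x[i][k] * beta[k];
                    }
                    rho += x[i][j] * (y[i] - partial);
                }
                rho /= n;
                // Soft-thresholding solves the one-dimensional subproblem
                // exactly; this is where the L1 penalty shrinks weights to 0.
                beta[j] = Math.signum(rho) * Math.max(0.0, Math.abs(rho) - lambda);
            }
        }
        return beta;
    }
}
```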

Greetings,

Sebastian

Least Angle Regression (LARS) and LASSO, ready to serve you at: https://sourceforge.net/projects/rm-featselext/

It's a pretty straightforward implementation of the LARS algorithm described in "Least Angle Regression" by Efron et al. (2004), The Annals of Statistics. It works so far, but I have not (yet) included the speed optimization described in a later section of the paper.
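For readers without the paper at hand, the core LARS step is roughly the following (notation as in Efron et al.; this is a summary of the paper, not the extension's exact code). With current fit $\hat{\mu}$, compute the correlations and the active set

$$\hat{c} = X^\top(y - \hat{\mu}), \qquad \hat{C} = \max_j |\hat{c}_j|, \qquad \mathcal{A} = \{\, j : |\hat{c}_j| = \hat{C} \,\},$$

then move $\hat{\mu}$ along the equiangular direction $u_{\mathcal{A}}$ of the active predictors by the smallest positive step that lets a new predictor join the tie,

$$\hat{\gamma} = \min_{j \in \mathcal{A}^c}{}^{+} \left\{ \frac{\hat{C} - \hat{c}_j}{A_{\mathcal{A}} - a_j},\; \frac{\hat{C} + \hat{c}_j}{A_{\mathcal{A}} + a_j} \right\}, \qquad a_j = x_j^\top u_{\mathcal{A}},$$

and update $\hat{\mu} \leftarrow \hat{\mu} + \hat{\gamma}\, u_{\mathcal{A}}$. The LASSO modification additionally removes a variable from $\mathcal{A}$ whenever its coefficient path crosses zero.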

Enjoy,

Ben

I also really like the added screenshots.

It seems that RFE reduces the selected features by removing the least-weighted features and re-weighting. Backward elimination seems to do the same thing (drop the worst features), but it is more tightly tied to the performance of a specific model and will likely blow up, since many combinations of dropped features must be tried.

Any input is appreciated. ;D

Thanks,

-Gagi

So for versions <= 1.0.3 the following holds:

The SVM-RFE operator uses the absolute weights of the inner SVM.

In detail: an invisible SVM-Weighting operator calculates the weights. They may contain negative values.

Then the weights are normalized. The normalization method maps the absolute values to the range [0..1]. That means negative weights with a large absolute value will have a high value (close to 1) after normalization. See AttributeWeights.normalize() for details.
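One way to realize the mapping just described, as an illustrative sketch only (the authoritative behaviour is whatever AttributeWeights.normalize() actually does):

```java
// Illustrative sketch of the described normalization, NOT the actual
// AttributeWeights.normalize() code: absolute values are scaled into
// [0, 1], so a strongly negative weight ends up near 1, not near 0.
static void normalizeByAbsoluteValue(double[] weights) {
    double max = 0.0;
    for (double w : weights) {
        max = Math.max(max, Math.abs(w));
    }
    if (max == 0.0) return; // all weights are zero; nothing to scale
    for (int i = 0; i < weights.length; i++) {
        weights[i] = Math.abs(weights[i]) / max;
    }
}
```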

The RFE operator with an arbitrary subprocess does not (yet) use absolute values. This means that if you want negative weights with a high absolute value to remain in the selection, you must normalize them or convert them to absolute values.

Wait a second.

...

I added a parameter "use_absolute_weights" to the RFE operator. It is available in the SVN version (trunk) but not yet released as a prebuilt JAR.

Regarding the question from your previous post, RFE vs. backward elimination: they are quite different! :-)

First, the similarity. BE and RFE both remove features from the full set of features, BUT:

In every round, BE has to evaluate the prediction performance of removing each feature, e.g. by cross-validation or bootstrapping. That's very expensive.

RFE, in contrast, only has to calculate one weight vector per round and then removes the feature(s)* with the smallest weights.

* The RFE operator usually removes several attributes in one round, whereas the BE operator removes only one feature each round (I'm not 100% sure about that).
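In code terms, one round of RFE as described above looks roughly like this. This is a self-contained sketch, not the operator's source; the weigher function is a stand-in for the inner SVM-Weighting.

```java
import java.util.*;
import java.util.function.Function;

// Sketch of the RFE loop: each round computes ONE weight vector for the
// surviving features and drops the k features with the smallest absolute
// weights. BE would instead have to cross-validate one candidate removal
// per remaining feature per round, which is far more expensive.
public class RfeSketch {

    static List<Integer> rfe(List<Integer> features,
                             Function<List<Integer>, double[]> weigher,
                             int targetSize, int dropPerRound) {
        List<Integer> selected = new ArrayList<>(features);
        while (selected.size() > targetSize) {
            final double[] w = weigher.apply(selected); // one weighting per round
            Integer[] order = new Integer[selected.size()];
            for (int i = 0; i < order.length; i++) order[i] = i;
            // rank surviving features by absolute weight, smallest first
            Arrays.sort(order, Comparator.comparingDouble(i -> Math.abs(w[i])));
            int k = Math.min(dropPerRound, selected.size() - targetSize);
            Set<Integer> drop = new HashSet<>(Arrays.asList(order).subList(0, k));
            List<Integer> next = new ArrayList<>();
            for (int i = 0; i < selected.size(); i++) {
                if (!drop.contains(i)) next.add(selected.get(i));
            }
            selected = next;
        }
        return selected;
    }
}
```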

Merry Christmas,

Ben

I will have to get the latest version once you build the JAR. For now I will use the SVM weighting for normalized weights.

Thanks and Merry Christmas,

-Gagi

Is it possible to return the non-normalized weights at the end of your operator? Say I want the 10 best features after RFE, but I also want to know the relative ranking of these 10 features. It would be great to output the actual weights of the final iteration rather than just 0/1. ;D

Thanks,

-Gagi

I'm sorry, but that's not possible at the moment and would interfere with the purpose of feature selection. I think you have to apply another round of SVM-Weighting on the selected features. I'll let you know if I find a workaround for that.
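Sketched in the same generic terms as the RFE loop above (hypothetical helper, not actual operator code), the workaround amounts to one extra weighting pass over the selected subset:

```java
import java.util.*;
import java.util.function.Function;

// Workaround sketch: after RFE returns its 0/1 selection, run one more
// weighting pass over the selected features only and sort by absolute
// weight to obtain the relative ranking.
public class RankAfterRfe {

    static LinkedHashMap<String, Double> rank(
            Set<String> selected,
            Function<Set<String>, Map<String, Double>> svmWeighting) {
        Map<String, Double> w = svmWeighting.apply(selected); // one extra run
        LinkedHashMap<String, Double> ranked = new LinkedHashMap<>();
        w.entrySet().stream()
         .sorted((a, b) -> Double.compare(Math.abs(b.getValue()),
                                          Math.abs(a.getValue())))
         .forEachOrdered(e -> ranked.put(e.getKey(), e.getValue()));
        return ranked;
    }
}
```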

Ben

Thanks, Ben!

-Gagi