
About the LinearRegression operator

Legacy User Member Posts: 0 Newbie
edited November 2018 in Help
Hi,

I am a new user of RapidMiner. I tried to use the LinearRegression operator and ran it with my dataset, but in the end I did not get an equation. I am currently trying to train a model.

My process is as follows:

Root

 >CSV Example Source

 >Attribute Filter

 >RemoveUselessAttributes

 >Genetic Algorithm

   >>Operator Chain

      >>>X-Validation

            >>>>LinearRegression

            >>>>OperatorChain(2)

                     >>>>>ModelApplier

                     >>>>>RegressionPerformance

            >>>>ProcessLog

 >CSV Example Set Writer
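
In the process XML the nesting is roughly as in the sketch below (a sketch only: operator classes are assumed from the names above, and all parameters are left out):

  <!-- sketch of the nesting described above; parameters omitted -->
  <operator name="Root" class="Process">
      <operator name="CSVExampleSource" class="CSVExampleSource"/>
      <operator name="AttributeFilter" class="AttributeFilter"/>
      <operator name="RemoveUselessAttributes" class="RemoveUselessAttributes"/>
      <operator name="GeneticAlgorithm" class="GeneticAlgorithm">
          <operator name="OperatorChain" class="OperatorChain">
              <operator name="XValidation" class="XValidation">
                  <operator name="LinearRegression" class="LinearRegression"/>
                  <operator name="OperatorChain (2)" class="OperatorChain">
                      <operator name="ModelApplier" class="ModelApplier"/>
                      <operator name="RegressionPerformance" class="RegressionPerformance"/>
                  </operator>
                  <operator name="ProcessLog" class="ProcessLog"/>
              </operator>
          </operator>
      </operator>
      <operator name="CSVExampleSetWriter" class="CSVExampleSetWriter"/>
  </operator>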


Thanks!

Answers

  • haddock Member Posts: 849 Maven
    Hi there,

    The optimisation produces a parameter set for the learning operator and anything else you tweak, rather than a model. To keep it simple, the process is like this:

    1. Generate a parameter set - you've done that.

    2. Apply that set using the parameter setter operator. You'll need to map between the optimised parameters and the ones you will use. So if I had optimised an SVM and some Validation settings, I would have this sort of mapping:

    <operator name="ParameterSetter" class="ParameterSetter" activated="no">
        <list key="name_map">
            <parameter key="NNValidation" value="NNValidation (2)"/>
            <parameter key="LibSVMLearner" value="LibSVMLearner (2)"/>
        </list>
    </operator>

    3. Run the same learner on an appropriate dataset (careful here). That will produce the optimised model, which has this sort of form:

    31.736 * a1 + 42.948 * a2 + 23.773 * a3 + 3.706 * a4 - 4.184 * a5 - 304.228

    where a1, a2, etc. are the attribute names.

    With RM, a big skill is keeping track of the inputs and outputs of the operators, because sometimes they are not what you think!
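
    For your LinearRegression case, steps 2 and 3 together might look roughly like the sketch below. The names here are assumptions - I'm guessing the second copy of the learner is called "LinearRegression (2)" - so map whatever your genetic algorithm actually optimised and drop any entries you did not tweak.

        <!-- sketch only: copies the optimised values onto a second learner -->
        <operator name="ParameterSetter" class="ParameterSetter">
            <list key="name_map">
                <parameter key="LinearRegression" value="LinearRegression (2)"/>
            </list>
        </operator>
        <!-- the second learner, run once on your data to produce the equation -->
        <operator name="LinearRegression (2)" class="LinearRegression"/>

    Running LinearRegression (2) after the ParameterSetter is what finally gives you the equation in the form shown above.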

  • Legacy User Member Posts: 0 Newbie
    Hi haddock,

    Thanks for your suggestion.

    So the parameter setter operator should include two learner operators and be placed after the genetic algorithm?

    Thanks
  • haddock Member Posts: 849 Maven
    No, just the one learner after setting the parameters.