
Weird learner parameters in EvolutionaryParameterOptimization

_paul_ Member Posts: 14 Contributor II
edited November 2018 in Help

I'm not sure if this is a bug, but when I embed the NearestNeighbors learner
into EvolutionaryParameterOptimization, the latter lets me choose among
parameters for the learner that are not actually visible in the learner itself.
For example, within the evolutionary optimization I can select
NearestNeighbors.kernel_gamma as a parameter to be optimized, but when
I check the learner, there is no such parameter. Here is the code to reproduce:

<operator name="Root" class="Process" expanded="yes">
    <description text="#ylt#p#ygt#This process is also a parameter optimization process like the first one discussed in the meta group. In this case, an evolutionary approach is used for the search of the best parameter combination. This approach is often more appropriate and leads to better results without defining the parameter combinations which should be tested (as for the Grid Search and  the quadratic parameter optimization approaches).#ylt#/p#ygt# #ylt#p#ygt#The parameters for the evolutionary parameter optimization approach are defined in the same way as for the other parameter optimization operators. Instead of a comma separated list of parameters which should be checked the user has to define a colon separated pair which is used as lower and upper bound for the specific parameters.#ylt#/p#ygt# "/>
    <operator name="ExampleSource" class="ExampleSource">
        <parameter key="attributes" value="../data/polynomial.aml"/>
    </operator>
    <operator name="ParameterOptimization" class="EvolutionaryParameterOptimization" expanded="yes">
        <list key="parameters">
          <parameter key="NearestNeighbors.kernel_gamma" value="[0.0;Infinity]"/>
        </list>
        <parameter key="max_generations" value="10"/>
        <parameter key="tournament_fraction" value="0.75"/>
        <parameter key="crossover_prob" value="1.0"/>
        <operator name="IteratingPerformanceAverage" class="IteratingPerformanceAverage" expanded="yes">
            <parameter key="iterations" value="3"/>
            <operator name="Validation" class="XValidation" expanded="yes">
                <parameter key="number_of_validations" value="2"/>
                <parameter key="sampling_type" value="shuffled sampling"/>
                <operator name="NearestNeighbors" class="NearestNeighbors"/>
                <operator name="ApplierChain" class="OperatorChain" expanded="yes">
                    <operator name="Test" class="ModelApplier">
                        <list key="application_parameters"/>
                    </operator>
                    <operator name="Performance" class="Performance"/>
                </operator>
            </operator>
        </operator>
        <operator name="Log" class="ProcessLog">
            <parameter key="filename" value="paraopt.log"/>
            <list key="log">
              <parameter key="kernel_gamma" value="operator.NearestNeighbors.parameter.kernel_gamma"/>
              <parameter key="error" value="operator.IteratingPerformanceAverage.value.performance"/>
            </list>
        </operator>
    </operator>
</operator>


  • cherokee Member Posts: 82  Guru
    Hi Paul,

    this is not a bug: kNN really does have this parameter. It is (only?) used when you choose KernelEuclideanDistance as the numerical measure. The GUI knows this and shows the parameter only when it is needed (try it!). EvolutionaryParameterOptimization doesn't know that, so the parameter is always listed there.

  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531   Unicorn
    Hi Paul,
    Michael is totally correct.
    Unfortunately, I don't have a clue how the Optimization Dialog could obey those dependencies, too...
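The dependency cherokee describes can be made visible by selecting the kernel distance in the learner explicitly. A hypothetical process fragment, assuming the parameter keys `measure_types` and `numerical_measure` (names taken from later RapidMiner versions of the k-NN operator, so they may differ in this release):

```xml
<!-- Sketch only: parameter keys are assumed and may differ by RapidMiner version -->
<operator name="NearestNeighbors" class="NearestNeighbors">
    <parameter key="measure_types" value="NumericalMeasures"/>
    <parameter key="numerical_measure" value="KernelEuclideanDistance"/>
    <!-- Once the kernel distance is selected, kernel_gamma appears in the GUI -->
    <parameter key="kernel_gamma" value="1.0"/>
</operator>
```

With this configuration the learner's parameter panel should show kernel_gamma, matching what the optimization dialog always offers.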
