
"EvolutionaryParameterOptimization and Weka Learner"

_paul__paul_ Member Posts: 14 Contributor II
edited May 2019 in Help
Hi,

I am trying to perform an evolutionary parameter optimization on the Weka RandomForest
learner. The following process

<operator name="Root" class="Process" expanded="yes">
    <operator name="ExampleSource" class="ExampleSource">
        <parameter key="attributes" value="../data/polynomial.aml"/>
    </operator>
    <operator name="ParameterOptimization" class="EvolutionaryParameterOptimization" expanded="yes">
        <list key="parameters">
          <parameter key="W-RandomForest.I" value="[1.0;100.0]"/>
          <parameter key="W-RandomForest.K" value="[1.0;14.0]"/>
        </list>
        <parameter key="max_generations" value="10"/>
        <parameter key="tournament_fraction" value="0.75"/>
        <parameter key="crossover_prob" value="1.0"/>
        <operator name="IteratingPerformanceAverage" class="IteratingPerformanceAverage" expanded="yes">
            <parameter key="iterations" value="3"/>
            <operator name="Validation" class="XValidation" expanded="yes">
                <parameter key="number_of_validations" value="2"/>
                <parameter key="sampling_type" value="shuffled sampling"/>
                <operator name="W-RandomForest" class="W-RandomForest">
                    <parameter key="I" value="62.714679449413616"/>
                    <parameter key="K" value="12.223632520022772"/>
                </operator>
                <operator name="ApplierChain" class="OperatorChain" expanded="yes">
                    <operator name="Test" class="ModelApplier">
                        <list key="application_parameters">
                        </list>
                    </operator>
                    <operator name="Performance" class="Performance">
                    </operator>
                </operator>
            </operator>
        </operator>
    </operator>
</operator>
produces warnings of this type:

Jul 9, 2009 6:22:51 PM: [Warning] ParameterOptimization: Cannot evaluate performance for current parameter combination: Cannot instantiate 'weka.classifiers.trees.RandomForest': java.lang.NumberFormatException: For input string: "56.645"
Does this mean that this Weka learner has some problems with arbitrary parameter values
and cannot be used for an evolutionary parameter optimization?

Regards,
Paul

Answers

  • landland RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi,
    this probably means that the Weka learners don't accept double values for their integer parameters. The problem is that Weka does not specify its parameter types properly, so most of them become plain strings. If the Weka learner then tries to parse a double value as an integer, it dies with this exception.
    So most Weka learners cannot be used with an evolutionary optimization for their non-double (integer) parameters. But as a work-around you could use a grid optimization, which ensures that only integer values are tested.
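    The exception can be reproduced in plain Java. This is only an illustration of the failure mode; the exact parsing call site inside Weka is an assumption:

```java
public class WekaParseDemo {
    public static void main(String[] args) {
        // Weka receives its options as plain strings; parsing an
        // integer option with Integer.parseInt rejects decimal notation:
        try {
            Integer.parseInt("56.645");
        } catch (NumberFormatException e) {
            System.out.println("NumberFormatException: " + e.getMessage());
        }

        // The same string parses fine as a double and could be rounded
        // to a legal integer value beforehand:
        int trees = (int) Math.round(Double.parseDouble("56.645"));
        System.out.println(trees); // prints 57
    }
}
```

    This is why an optimizer that emits a value like "56.645" for such a parameter fails, while one that emits "56" works.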

    If that does not fulfill your needs, there is a rather dark way around this, using a dummy operator, a ParameterCloner and two macro operators. But since I don't want to lead you to the dark side of process design, I'm not posting it here as long as the above solution might work ;)

    Greetings,
      Sebastian
  • _paul__paul_ Member Posts: 14 Contributor II
    Hi Sebastian,

    this probably means that the Weka learners don't accept double values for their integer parameters.
    This is somewhat confusing, since I'm trying to optimize the parameters "I" and "K", which are defined
    in RapidMiner as real types (not integer). So if really only integer parameters are allowed here, this
    should be made consistent with RapidMiner's GUI and documentation.  ;)

    You also mentioned the GridOptimization wrapper as a work-around. I don't have a clue how to model
    this in RapidMiner. Could you provide me with an example?

    Best,
    Paul
  • landland RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi,
    the grid parameter optimization is quite easy to use... if you know the evolutionary optimization, the grid optimization looks pretty much the same, but you specify every point that should be tested. So take a look at this process:
    <operator name="Root" class="Process" expanded="yes">
        <operator name="ExampleSource" class="ExampleSource">
            <parameter key="attributes" value="C:\Dokumente und Einstellungen\sland\Eigene Dateien\yale\workspace\sample\data\iris.aml"/>
        </operator>
        <operator name="GridParameterOptimization" class="GridParameterOptimization" expanded="yes">
            <list key="parameters">
              <parameter key="W-RandomForest.I" value="[1.0;100.0;100;linear]"/>
              <parameter key="W-RandomForest.K" value="[1.0;10.0;10;linear]"/>
            </list>
            <operator name="IteratingPerformanceAverage" class="IteratingPerformanceAverage" expanded="yes">
                <parameter key="iterations" value="3"/>
                <operator name="Validation" class="XValidation" expanded="yes">
                    <parameter key="number_of_validations" value="2"/>
                    <parameter key="sampling_type" value="shuffled sampling"/>
                    <operator name="W-RandomForest" class="W-RandomForest" breakpoints="after">
                        <parameter key="I" value="62.714679449413616"/>
                        <parameter key="K" value="12.223632520022772"/>
                    </operator>
                    <operator name="ApplierChain" class="OperatorChain" expanded="yes">
                        <operator name="Test" class="ModelApplier">
                            <list key="application_parameters">
                            </list>
                        </operator>
                        <operator name="Performance" class="Performance">
                        </operator>
                    </operator>
                </operator>
            </operator>
        </operator>
    </operator>
    The second problem you are facing with your setup is that Weka's RandomForest does not support numerical labels like the ones provided by the polynomial data set.

    One last word on the inconsistency between the parameter description and the evaluation: we can't do anything about it, because we have to rely on what Weka specifies. But that specification seems to be wrong here...

    Greetings,
      Sebastian
  • _paul__paul_ Member Posts: 14 Contributor II
    Hi Sebastian,

    So I got you wrong:
    I thought that you could build a work-around for the EvolutionaryOptimization using the GridOptimization
    such that the evolutionary approach is still applied in the end.

    Since this is not possible, I'd be curious about the "dark way" you mentioned previously. :-)
    Can macros be used to perform an EvolutionaryOptimization on the Weka learners?

    Regards,
    Paul
  • landland RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Ok Paul,
    then let me lead you to the dark side. In fact it did not turn out half as dark as I expected, but I hope you can enjoy it anyway.
    <operator name="Root" class="Process" expanded="yes">
        <operator name="ExampleSource" class="ExampleSource">
            <parameter key="attributes" value="C:\Dokumente und Einstellungen\sland\Eigene Dateien\yale\Workspace 4.5\sample\data\iris.aml"/>
        </operator>
        <operator name="ParameterOptimization" class="EvolutionaryParameterOptimization" expanded="yes">
            <list key="parameters">
              <parameter key="DummyFor_I.k" value="[1.0;100.0]"/>
              <parameter key="DummyFor_K.k" value="[1.0;14.0]"/>
            </list>
            <parameter key="max_generations" value="10"/>
            <parameter key="tournament_fraction" value="0.75"/>
            <parameter key="crossover_prob" value="1.0"/>
            <operator name="IteratingPerformanceAverage" class="IteratingPerformanceAverage" expanded="yes">
                <parameter key="iterations" value="3"/>
                <operator name="Validation" class="XValidation" expanded="yes">
                    <parameter key="number_of_validations" value="2"/>
                    <parameter key="sampling_type" value="shuffled sampling"/>
                    <operator name="OperatorChain" class="OperatorChain" expanded="yes">
                        <operator name="DummyFor_K" class="NearestNeighbors" activated="no">
                            <parameter key="k" value="14"/>
                        </operator>
                        <operator name="DummyFor_I" class="NearestNeighbors" activated="no">
                            <parameter key="k" value="19"/>
                        </operator>
                        <operator name="ParameterCloner" class="ParameterCloner">
                            <list key="name_map">
                              <parameter key="DummyFor_K.k" value="W-RandomForest.K"/>
                              <parameter key="DummyFor_I.k" value="W-RandomForest.I"/>
                            </list>
                        </operator>
                        <operator name="W-RandomForest" class="W-RandomForest">
                            <parameter key="I" value="19"/>
                            <parameter key="K" value="14"/>
                        </operator>
                    </operator>
                    <operator name="ApplierChain" class="OperatorChain" expanded="yes">
                        <operator name="Test" class="ModelApplier">
                            <list key="application_parameters">
                            </list>
                        </operator>
                        <operator name="Performance" class="Performance">
                        </operator>
                    </operator>
                </operator>
            </operator>
        </operator>
    </operator>
    Sebastian
  • BAMBAMBAMBAMBAMBAM Member Posts: 20 Maven
    Hi Sebastian ... I think I understand your "dark side" approach, with one critical problem: I don't understand how the ParameterCloner turns strings containing real numbers into strings containing integers (or even strings with one digit after the decimal point).

    When I try the 2 dummy parameters and 2 dummy operators (disabled) + ParameterCloner approach, I still get these errors:
    Aug 22, 2009 8:49:03 PM: [Warning] EvolutionaryParameterOptimization: Cannot evaluate performance for current parameter combination: Cannot instantiate 'weka.classifiers.trees.REPTree': java.lang.NumberFormatException: For input string: "62.704"
    Aug 22, 2009 8:49:03 PM: [Warning] EvolutionaryParameterOptimization: Cannot evaluate performance for current parameter combination: Cannot instantiate 'weka.classifiers.trees.REPTree': java.lang.NumberFormatException: For input string: "191.373"
    Aug 22, 2009 8:49:03 PM: [Warning] EvolutionaryParameterOptimization: Cannot evaluate performance for current parameter combination: Cannot instantiate 'weka.classifiers.trees.REPTree': java.lang.NumberFormatException: For input string: "851.304"
    Aug 22, 2009 8:49:03 PM: [Warning] EvolutionaryParameterOptimization: Cannot evaluate performance for current parameter combination: Cannot instantiate 'weka.classifiers.trees.REPTree': java.lang.NumberFormatException: For input string: "458.621"
    Aug 22, 2009 8:49:03 PM: [Warning] EvolutionaryParameterOptimization: Cannot evaluate performance for current parameter combination: Cannot instantiate 'weka.classifiers.trees.REPTree': java.lang.NumberFormatException: For input string: "118.680"


    Here is my XML:
    <operator name="EvolutionaryParameterOptimization" class="EvolutionaryParameterOptimization" breakpoints="after" expanded="yes">
                <list key="parameters">
                  <parameter key="DummyTreeM.M" value="[50;1000]"/>
                  <parameter key="DummyTreeV.V" value="[0.0005;0.005]"/>
                </list>
                <operator name="XValidation" class="XValidation" expanded="yes">
                    <parameter key="number_of_validations" value="3"/>
                    <parameter key="sampling_type" value="shuffled sampling"/>
                    <operator name="OperatorChain" class="OperatorChain" expanded="yes">
                        <operator name="AttributeFilter (2)" class="AttributeFilter">
                            <parameter key="condition_class" value="attribute_name_filter"/>
                            <parameter key="parameter_string" value="pred|ProfitLoss"/>
                            <parameter key="invert_filter" value="true"/>
                        </operator>
                        <operator name="DummyTreeM" class="W-REPTree" activated="no">
                            <parameter key="M" value="118"/>
                        </operator>
                        <operator name="DummyTreeV" class="W-REPTree" activated="no">
                            <parameter key="V" value="0.002"/>
                        </operator>
                        <operator name="ParameterCloner" class="ParameterCloner">
                            <list key="name_map">
                              <parameter key="DummyTreeM.M" value="MainREPTree.M"/>
                              <parameter key="DummyTreeV.V" value="MainREPTree.V"/>
                            </list>
                        </operator>
                        <operator name="MainREPTree" class="W-REPTree">
                            <parameter key="M" value="118"/>
                            <parameter key="V" value="0.002"/>
                        </operator>
                    </operator>
                    <operator name="ApplierChain" class="OperatorChain" expanded="no">
                        <operator name="Applier" class="ModelApplier">
                            <parameter key="keep_model" value="true"/>
                            <list key="application_parameters">
                            </list>
                        </operator>
                        <operator name="ChangeAttributeName" class="ChangeAttributeName" activated="no">
                            <parameter key="old_name" value="prediction(RRRatio)"/>
                            <parameter key="new_name" value="pred"/>
                        </operator>
                        <operator name="AttributeFilter (3)" class="AttributeFilter">
                            <parameter key="condition_class" value="attribute_name_filter"/>
                            <parameter key="parameter_string" value="pred|ProfitLoss"/>
                            <parameter key="invert_filter" value="true"/>
                        </operator>
                        <operator name="PredictionCopy" class="AttributeCopy">
                            <parameter key="attribute_name" value="prediction(RRRatio)"/>
                            <parameter key="new_name" value="pred"/>
                        </operator>
                        <operator name="ProfitLossConstruction" class="AttributeConstruction">
                            <list key="function_descriptions">
                              <parameter key="ProfitLoss" value="if(pred&gt;%{longT}, Rise, if(pred&lt;%{shortT},-Rise, 0))"/>
                            </list>
                            <parameter key="use_standard_constants" value="false"/>
                        </operator>
                        <operator name="Data2Performance" class="Data2Performance">
                            <parameter key="keep_example_set" value="true"/>
                            <parameter key="performance_type" value="data_value"/>
                            <parameter key="attribute_name" value="ProfitLoss"/>
                            <parameter key="example_index" value="1"/>
                        </operator>
                        <operator name="RegressionPerformance" class="RegressionPerformance" activated="no">
                            <parameter key="keep_example_set" value="true"/>
                            <parameter key="root_mean_squared_error" value="true"/>
                            <parameter key="spearman_rho" value="true"/>
                            <parameter key="use_example_weights" value="false"/>
                        </operator>
                        <operator name="MinMaxWrapper" class="MinMaxWrapper" activated="no">
                            <parameter key="minimum_weight" value="0.9"/>
                        </operator>
                    </operator>
                </operator>
                <operator name="ProcessLog" class="ProcessLog">
                    <list key="log">
                      <parameter key="tries" value="operator.MainREPTree.value.applycount"/>
                      <parameter key="ProfitLoss" value="operator.Data2Performance.value.performance"/>
                    </list>
                </operator>
            </operator>


    Sorry it's so complicated, but I'm trying to get the approach to work with my actual project.

    Perhaps string operators could be used to remove the digits after the decimal points?
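    In plain Java the truncation itself would be trivial (this is just a sketch of the idea; I don't know which operator, if any, could apply it to a parameter value inside a process):

```java
public class TruncateParam {
    public static void main(String[] args) {
        // Turn a real-valued parameter string into an integer string
        // by parsing it as a double and truncating the fraction.
        String raw = "62.704";
        String truncated = Integer.toString((int) Double.parseDouble(raw));
        System.out.println(truncated); // prints "62"
    }
}
```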

    thanks for all your help,
    John
  • landland RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi John,
    the ParameterCloner does not change anything. It just reads the parameters from the NearestNeighbors operators, which are guaranteed to be integers. The parameter optimization recognizes that those parameter types are integer and only generates integer values. Unfortunately, the Weka learner specifies its parameter types incorrectly: although the parameter must be an integer, it declares a double parameter...


    Greetings,
      Sebastian
  • BAMBAMBAMBAMBAMBAM Member Posts: 20 Maven
    Ah - the trick is to use a non-Weka operator's parameter that has an integer value type... OK, I tried it and it works! Thanks!

    Next trick: is it possible to use a ParameterOptimization to optimize macro values? I haven't been able to figure out how to copy a parameter's value into a macro's value yet...
  • landland RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi,
    that's not magic at all, it works just the same way: copy the desired parameter value into the value parameter of a SingleMacroDefinition operator using the ParameterCloner. The macro operator has to come after the ParameterCloner, of course :)

    Greetings,
      Sebastian
  • BAMBAMBAMBAMBAMBAM Member Posts: 20 Maven
    Thank you, Sebastian!

    I have implemented the solution using the "dummy operator" method you outlined. So instead of using macros, I use dummy operator variables and AttributeConstruction to change the value for each example in every iteration. This is turning out to be very slow; I believe it is because the code must reallocate space for a large list/array in every iteration. So now I am experimenting with using AttributeConstruction just once (when the data is loaded) and then, in each iteration, using SetData or some other operator to change the value of each example to a value that depends on the prediction generated for that example. However, SetData doesn't allow the use of formulas. Is there an operator like AttributeChangeValue that has the formula-parsing power of AttributeConstruction but doesn't have the large overhead of reallocating memory?

    (and also - is using SingleMacroDefinition + ParameterCloner faster than using a dummy operator + ParameterCloner?)

    thanks in advance,
    John
  • landland RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi John,
    you could use the MacroConstruction operator and then use the computed macro as the value in the SetData operator...
    There shouldn't be any noticeable speed difference between SingleMacroDefinition + ParameterCloner and a dummy operator + ParameterCloner. Compared to the usual data mining computation times, that's peanuts.

    And yes, AttributeConstruction needs to create a new column, which is why in 4.x the complete data has to be copied every now and then.

    If you really need a fast solution, writing a specialized operator is always the fastest way...

    Greetings,
      Sebastian