Parameter optimisation operators offer different parameter settings
awchisholm
RapidMiner Certified Expert, Member Posts: 458 Unicorn
Hello,
I have noticed that the "Optimize Parameter (Grid)" operator offers more parameters to vary for inner operators when compared to the "Optimize Parameter (Evolutionary)" operator.
For the k-NN operator, I am offered the full range of this operator's parameters when I use the Grid optimizer. When I use the Evolutionary operator the number of possibilities shrinks; it is not possible to vary the nominal measure type, for example.
This means I can't take advantage of the evolutionary operator to find optimum parameter settings, so I am slowed down by the brute-force grid approach. It's not too bad - I go for a cup of tea while the process is running.
Anyhow, is there a subtle point I am missing that restricts available parameters in the evolutionary case?
regards
Andrew
Answers
The reason is simple: the optimization scheme used is an Evolutionary Strategy, which works on real numbers only and simply cannot handle nominal values (and hence nominal parameters). For integers we added a "hack" so that they are supported as well, but for nominal parameters this is not really possible. In general, mixing nominal and numerical values is not the best idea for evolutionary algorithms: the best results would be achieved by transforming the nominal parameters into binominal ones beforehand, which can sometimes blow up the search space a lot. Otherwise, the assumption that smaller mutations should be more probable than larger ones would no longer hold.
Cheers,
Ingo
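To illustrate the point above, here is a minimal sketch in plain Python (not a RapidMiner process; the k-NN parameter names and value ranges are hypothetical). Gaussian mutation gives real and integer parameters a natural notion of a "small" step, whereas a nominal parameter such as the measure type can only be resampled uniformly, so the assumption that small mutations are more probable than large ones no longer applies.

```python
import random

# Hypothetical k-NN parameters for illustration only (not RapidMiner's actual API).
K_RANGE = (1, 100)                     # integer parameter: supported via the "hack"
WEIGHT_RANGE = (0.0, 1.0)              # real parameter: the natural ES case
MEASURE_TYPES = ["EuclideanDistance",  # nominal parameter: no ordering, no distance
                 "ManhattanDistance",
                 "CosineSimilarity"]

def mutate_real(value, sigma, low, high):
    """Gaussian mutation: small steps are much more likely than large ones."""
    return min(high, max(low, value + random.gauss(0.0, sigma)))

def mutate_int(value, sigma, low, high):
    """Integer 'hack': mutate as a real number, then round back to an integer."""
    return int(round(mutate_real(float(value), sigma, low, high)))

def mutate_nominal(value, choices):
    """No notion of a 'nearby' category, so mutation degenerates to uniform resampling."""
    return random.choice(choices)

individual = {"k": 5, "weight": 0.5, "measure": "EuclideanDistance"}
mutated = {
    "k": mutate_int(individual["k"], 3.0, *K_RANGE),
    "weight": mutate_real(individual["weight"], 0.05, *WEIGHT_RANGE),
    "measure": mutate_nominal(individual["measure"], MEASURE_TYPES),
}
print(mutated)
```

Turning the nominal parameter into binominal ones instead (one boolean per measure type) would restore a numeric encoding, but it also multiplies the number of dimensions the strategy has to search, which is the search-space blow-up mentioned above.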
I get it - you mean the parameters to be varied for whatever operator is chosen must be real or integer. Actually this is quite an insight (for me) since it means I can't just throw an evolutionary operator at a problem and hope for the best; I actually have to think about it.
As it happens, I've been assuming I could work around this, but now I see there's a fundamentally good reason for the way it is. If you have the time, I would suggest adding an explanatory sentence to the documentation or to the operator itself.
regards
Andrew