Feature and parameter optimization - which comes first?
When I thought about feature and parameter selection, I was reminded of the well-known chicken-and-egg problem: which one comes first?
The feature selection has a nested learner which requires parameters that are not known at the beginning. Based on its result, attributes are selected.
In the next step I do parameter optimization for the following learner. However, the result is linked to the quality of the attributes fed in.
Is there a rule of thumb for approaching this dilemma?
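One common way to sidestep the circularity is to optimize both jointly: put the feature selector and the learner into a single pipeline and let one search tune the selector's settings and the learner's parameters at the same time. As a minimal sketch (using scikit-learn rather than RapidMiner operators, purely for illustration; the dataset and parameter values are arbitrary choices):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Feature selection and the learner live in one pipeline, so a single
# cross-validated grid search tunes both jointly instead of sequentially.
pipe = Pipeline([
    ("select", SelectKBest(score_func=f_classif)),
    ("tree", DecisionTreeClassifier(random_state=0)),
])

param_grid = {
    "select__k": [5, 10, 20],          # how many attributes to keep
    "tree__max_depth": [3, 5, None],   # learner parameter
}

search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)
```

The same idea carries over to RapidMiner: nesting the feature-selection step inside the parameter-optimization loop (or vice versa) means neither step is stuck with a fixed, possibly suboptimal choice from the other; the price is a larger search space and longer runtime.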