If you are looking for classic stepwise regression, you can use either the Forward Selection or the Backward Elimination operator (both found in the feature selection operator folder) with Linear Regression as the inner learner. You can base the selection criterion either on the absolute change in performance or on the associated alpha values.
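If it helps to see the logic outside the operator GUI, here is a minimal sketch of forward selection with a linear-regression inner learner, using the "absolute change in performance" stopping rule. The function names and the relative-improvement threshold are my own illustrative choices, not anything from RapidMiner:

```python
import numpy as np

def fit_sse(X, y):
    # Least-squares fit with an intercept; return the sum of squared errors.
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return float(resid @ resid)

def forward_selection(X, y, min_improvement=1e-3):
    # Greedily add the feature that most reduces SSE, stopping when the
    # improvement (relative to the intercept-only SSE) drops below threshold.
    n, p = X.shape
    selected, remaining = [], list(range(p))
    tss = float(((y - y.mean()) ** 2).sum())  # intercept-only SSE
    best_sse = tss
    while remaining:
        scores = [(fit_sse(X[:, selected + [j]], y), j) for j in remaining]
        sse, j = min(scores)
        if (best_sse - sse) / tss < min_improvement:
            break
        selected.append(j)
        remaining.remove(j)
        best_sse = sse
    return selected

# Toy data: y depends only on columns 0 and 2 of five candidate features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(scale=0.1, size=200)
print(sorted(forward_selection(X, y)))  # the two informative columns
```

Backward elimination is the mirror image: start with all features and repeatedly drop the one whose removal hurts performance least, until any removal costs more than the threshold.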
This graphic depicts forward selection. It also envisions adding all cross-product terms (second-order interactions) and potentially even higher-order interactions. I'm not sure where you are getting this guidance, but in my view it certainly doesn't represent state-of-the-art practice for model construction in modern data science. I'd be very wary of overfitting if you blindly dump all of these interaction terms into a regression framework without doing any kind of feature engineering or feature selection.
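To make the overfitting concern concrete, here is a quick count of how fast the candidate-term list grows when you include all interactions up to a given order (the helper function is just for illustration):

```python
from math import comb

def n_terms(p, order):
    # Number of polynomial terms (excluding the intercept) in p features
    # up to the given order, counting all cross-products and powers:
    # C(p + order, order) - 1.
    return comb(p + order, order) - 1

for p in (10, 50, 100):
    print(p, n_terms(p, 2), n_terms(p, 3))
```

With 100 raw attributes, second-order interactions already give you 5,150 candidate terms, and third-order gives 176,850. On a typical data set that is far more terms than observations, which is exactly the regime where an unregularized regression will overfit badly.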