Training and prediction elapsed time
I'm working on my thesis, which is about time series forecasting. Specifically, I've built a framework for stock recommendation based on stocks' historical data. I've tested many regression models to see which one performs better, both in terms of prediction error and of actual gain when used for trading.
I'd now like to evaluate this framework in terms of scalability, measuring the time needed to perform a prediction and the time needed for model training, as a function of the number of stocks one wants to forecast. This way, I could show whether the designed framework is suitable for online analysis (e.g. if the prediction time is sufficiently low).
Is there an operator in RapidMiner to measure these quantities (e.g. the time a trained model needs to score a single record, or the training time, though the latter is relatively simple to infer)?
I know this is a somewhat odd question, since these times vary with the underlying hardware, but I need a graph showing performance in terms of elapsed time.
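For comparison, outside RapidMiner the two quantities can be measured manually with wall-clock timers. Below is a minimal sketch in Python; the least-squares model and the random data are placeholders standing in for the actual regression model and stock features, and `time.perf_counter` is used because it is a monotonic, high-resolution clock:

```python
import time
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical dataset: 1000 records, 10 lagged-price features
X = rng.random((1000, 10))
y = rng.random(1000)

# Training time: ordinary least-squares fit as a stand-in for the real model
start = time.perf_counter()
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
training_time = time.perf_counter() - start

# Prediction time: score a single record, as an online system would
start = time.perf_counter()
y_hat = X[:1] @ coef
prediction_time = time.perf_counter() - start

print(f"training: {training_time:.6f}s  "
      f"single-record prediction: {prediction_time:.6f}s")
```

Repeating this for increasing numbers of stocks would give the data points for the scalability graph; averaging several runs per point smooths out timer and scheduler noise.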