How can I sum up weights in each training round of a GBT or Random Forest?
I am using a GBT and/or Random Forest algorithm inside an X-Validation, which itself sits inside a parameter optimization. The Gradient Boosted Trees and Random Forests can output the attribute weights for each model they build. I would like to catch those weights and sum them up across the training rounds (i.e. across the cross-validation folds). I tried to output the attribute weights from inside the X-Validation with a Remember operator, recall them with a Recall operator outside the X-Validation, and sum them up with the Aggregate operator. That way I would get an averaged ranking of the most important attributes over all training rounds.
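Outside RapidMiner, the logic I am after would look roughly like this (a minimal sketch using scikit-learn as a stand-in for the GBT inside the X-Validation; the dataset and model parameters are just illustrative assumptions): train one model per fold, add each fold's feature importances to a running total, then divide by the number of folds.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import KFold

# Toy data standing in for my real example set
X, y = make_classification(n_samples=300, n_features=5, random_state=0)

kf = KFold(n_splits=5, shuffle=True, random_state=0)
summed = np.zeros(X.shape[1])  # running total, like Remember/Recall should keep

for train_idx, _ in kf.split(X):
    model = GradientBoostingClassifier(random_state=0)
    model.fit(X[train_idx], y[train_idx])
    summed += model.feature_importances_  # add this fold's attribute weights

averaged = summed / kf.get_n_splits()  # mean importance per attribute
print(averaged)
```

The key point is that `summed` must survive from one fold to the next, which is exactly what does not seem to happen with my Remember/Recall setup.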
But somehow, in each round, my old values seem to disappear, so nothing is ever summed up. Why is that? Is it caused by the parameter optimization operator?