# H2O Variable Importance

The H2O Deep Learning model provides a "compute variable importance" option.

If selected, the output of the Deep Learning model lists the top ten most important attributes. Is there a way to increase this number to the top 20 or 100?

## Answers

I am sure that H2O is calculating variable importance for all variables in your dataset; I think it's the RapidMiner view that restricts you to seeing only some of them. I can see the top 10 and bottom 10 variables and their importance. I don't see any option to extend this.
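If the full importance table can be exported from the model (for example as attribute/score pairs), taking any top-k is straightforward. This is a minimal sketch assuming such a list is available; it does not reference any particular RapidMiner or H2O API:

```python
def top_k(importances, k):
    """importances: list of (attribute, score) pairs.
    Returns the k highest-scoring attributes, best first."""
    return sorted(importances, key=lambda pair: pair[1], reverse=True)[:k]

# Example: ask for the top 2 of 3 attributes
ranking = top_k([("age", 0.1), ("income", 0.9), ("city", 0.5)], 2)
```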

@hughesfleming68 any suggestion on this?

Thanks

Varun

https://www.varunmandalapu.com/

You can use the Explain Predictions operator to see which variables impacted your model's predictions. I use this a lot compared to variable importance. One reason is a limitation of H2O's variable importance method (Gedeon-based), which extracts importance from the weights of only the first two layers of a network; for large networks this is not ideal, since the deeper layers can also influence variable importance.

Everything has its own limitations.
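For intuition, here is a rough sketch of the general Gedeon idea mentioned above (an assumption about the method's overall shape, not H2O's actual implementation): an input's importance is derived from the absolute-weight shares in the first two layers only, which is exactly why anything learned in deeper layers is ignored:

```python
def gedeon_importance(w1, w2):
    """Gedeon-style importance sketch.
    w1[i][j]: weight from input i to first-hidden unit j
    w2[j][k]: weight from first-hidden unit j to second-hidden unit k
    Assumes no column of weights is entirely zero."""
    n_in, n_h1, n_h2 = len(w1), len(w1[0]), len(w2[0])

    # P[i][j]: share of hidden unit j's incoming weight magnitude from input i
    col1 = [sum(abs(w1[i][j]) for i in range(n_in)) for j in range(n_h1)]
    P = [[abs(w1[i][j]) / col1[j] for j in range(n_h1)] for i in range(n_in)]

    # Q[j][k]: the same share for the second layer
    col2 = [sum(abs(w2[j][k]) for j in range(n_h1)) for k in range(n_h2)]
    Q = [[abs(w2[j][k]) / col2[k] for k in range(n_h2)] for j in range(n_h1)]

    # Importance of input i: sum of weight shares over all paths i -> j -> k.
    # Deeper layers never enter this computation.
    return [sum(P[i][j] * Q[j][k] for j in range(n_h1) for k in range(n_h2))
            for i in range(n_in)]
```

Note that only `w1` and `w2` appear anywhere in the score, which illustrates the limitation described above.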

Varun


The Explain Predictions operator explains the prediction of each data row. Thus, gathering a general (overall) idea of the important attributes is somewhat challenging. Maybe the attribute frequencies for each prediction could be calculated manually to find the most effective supporting and contradicting attributes.
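The manual aggregation described above could be sketched like this. It is a hypothetical workflow, assuming each row's explanation has been exported as lists of supporting and contradicting attributes; `rank_attributes` is an illustrative helper, not a RapidMiner operator:

```python
from collections import Counter

def rank_attributes(explanations):
    """explanations: list of per-row dicts such as
    {"supporting": ["att1", ...], "contradicting": ["att2", ...]}.
    Returns attributes sorted by net support frequency, best first."""
    support, contradict = Counter(), Counter()
    for row in explanations:
        support.update(row.get("supporting", []))
        contradict.update(row.get("contradicting", []))
    atts = set(support) | set(contradict)
    # Net score: how often the attribute supported minus contradicted predictions
    return sorted(atts, key=lambda a: support[a] - contradict[a], reverse=True)

# Example over two explained rows
rows = [{"supporting": ["age", "income"], "contradicting": ["city"]},
        {"supporting": ["age"], "contradicting": ["income"]}]
ranked = rank_attributes(rows)
```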

I am trying to work on some feature selection techniques based on this operator; if @IngoRM does it earlier, it will be available in RapidMiner.

https://community.rapidminer.com/discussion/55351/explain-predictions-ranking-attributes-that-supports-and-contradicts-correct-predictions#latest

Thanks

Varun


I'm looking forward to seeing your solutions.